I chose not to source my data directly from Maryland's State Government site because the format was not easy to use. The official Maryland Government data basically has each day as a column and each ZIP code as a row, which is not as easy to work with as the data provided from the site above. So there may be a few discrepancies between the data on a day-to-day basis, but the totals are identical. You can read about their methodology for retrieving data from the various official State Government websites and about the quality of each.
This post is in no way intended to attack anyone, be part of a political movement, promote any political, financial, or social agendas, or support any causes out there except one: highlighting how data and statistics can be used to tell stories. As data professionals, we need to give importance to the quality of data and the quality of data reporting. Basically, this is a lesson in data visualization and telling stories with data.
I presented an introduction to Azure Databricks on May 22, 2020 to one of our local SQL Server User Groups here in the Washington DC area. It was done online due to the COVID-19 restrictions on gatherings. The good that came out of doing it online was that the session was recorded and published on YouTube. Here is the video; I hope you enjoy learning!
Ever wanted to know the reason behind the Database color scheme in SQL Server Management Studio?
I have always wondered why all the databases in Management Studio were Yellow (Online), Grey (Read-Only), Lighter Yellow with text next to it (something like Restoring), or Yellow with a Red Icon (Offline). Have you ever wondered why there aren't more useful colors, like Red for when a Transaction Log is full or Purple for when your Statistics are outdated? I have a theory on this that I have entitled "The Theory that you have to actually work to call yourself a DBA."
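If you would rather not squint at the icons, the same information is only a query away. Here is a quick sketch (my own illustration, not part of the original post) that pulls the state behind each color from sys.databases:

SELECT name,
       state_desc,        -- ONLINE, RESTORING, OFFLINE, etc.
       is_read_only,      -- 1 = the grey Read-Only icon
       user_access_desc   -- MULTI_USER, SINGLE_USER, RESTRICTED_USER
FROM sys.databases
ORDER BY name;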
For the past couple of months I have been working with Ben Weiss, a Digital Marketing Strategist at Infusive Solutions (http://www.infusivesolutions.com/). Our partnership was based on a mutual understanding that IT professionals need more career growth and guidance. Since my background is primarily in database management systems, I have been helping him with writing articles, guides, and even interview questions for top professionals in the field, all for the purpose of furthering the careers of IT professionals.
I received an email from a connection on LinkedIn seeking some advice on the architecture of a new reporting system. Although I don't know the gentleman personally, he is a SQL Server professional and I felt obliged to help out a fellow colleague in the field. The email is included below, with my suggested solutions following it.
A part of every DBA’s job is to justify all those disk space requests. The phrases “I just need it” or “Trust me this database is going to grow one terabyte this year” might not convince your boss.
Many people are under the impression that disk space is cheap because they base that notion on the Shell Shocker deal they saw on Newegg or the weekly circular from Best Buy. If you are using consumer SATA drives from retail outlets for your production databases, please do the following:
1. Update your resume
2. Apply for jobs at least 1000 miles from your current location
3. Hope the next DBA that replaces you has more common sense
We're in the era of SANs, iSCSI, SAS drives, SSDs, and RAID architecture! You should be forking out almost the cost of your server in disk space. Unless, of course, losing data is an option where you work.
Now that I've bored you or intrigued you (or something in between), I can get to my point. There is a wonderful tool that comes with SQL Server 2008 and later called the Management Data Warehouse (MDW), also known as the Data Collector.
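MDW deserves its own walkthrough, but as a rough sketch of the kind of size data it collects for you, here is a hand-rolled snapshot query using sys.master_files (my own illustration, not part of the MDW setup):

-- One-off snapshot of data and log file sizes per database (size is in 8 KB pages)
SELECT DB_NAME(database_id) AS database_name,
       type_desc,                    -- ROWS or LOG
       SUM(size) * 8 / 1024 AS size_mb
FROM sys.master_files
GROUP BY database_id, type_desc
ORDER BY database_name, type_desc;

Run something like this on a schedule into a history table and you have a growth trend to show your boss; the MDW Disk Usage collection set automates exactly that kind of collection and reporting for you.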
Ever wonder who all your orphaned SQL Server users are on your server? Just run the script below and copy the results of the query into a new query window, then execute them. Make sure you keep the spacing and the quotes as they are so that the generated dynamic SQL is free of syntax errors.
USE master;
-- QUOTENAME handles database names that need brackets
SELECT 'USE ' + QUOTENAME(name) + ' EXEC sp_change_users_login ''Report'' '
FROM sys.databases
WHERE name NOT IN ('master', 'msdb', 'model', 'distribution', 'tempdb')
ORDER BY name;
But wait a minute, that only tells me which orphaned users are on my server. I want to actually fix them! OK, OK, relax, I have the solution for that; just keep reading.
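One common fix (not necessarily the exact one coming up later in the post) is to remap the database user to the server login of the same name. A minimal sketch, with placeholder names:

USE YourDatabase;  -- placeholder database name
-- Remap the orphaned user to the login with the matching name
ALTER USER [OrphanedUserName] WITH LOGIN = [OrphanedUserName];
-- Older, deprecated equivalent:
-- EXEC sp_change_users_login 'Auto_Fix', 'OrphanedUserName';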
One of the cool new features in SQL Server 2012 is the Sequence Object. The worst part of this feature is that whenever I mention it's a cool new feature, a lot of people say "that has been in Oracle forever." Patience is a virtue!!
Anyway, I like to think of the Sequence Object as a "Global Identity" that can be shared by many objects in the same database, or even across databases on the same server (I have not tried it via Linked Servers because I only have one test SQL 2012 server). You can take values from the Sequence Object and insert them into tables, or you can have a column in a table default to a value from the Sequence Object. A word to the wise: if you use a Sequence Object as a table column's default, you cannot drop the sequence because of the dependency. That is actually a pretty good safeguard.
I have included a code snippet so that you can play around with this new feature and I’ve put a few screen shots to show how you can manage it via the GUI. Make sure you run the code section by section; I have included a lot of comments to let you know what to expect while running it.
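The full snippet from the post isn't reproduced here, but a minimal sketch of the idea (the sequence and table names are made up) looks like this:

-- Create a sequence that several tables can share
CREATE SEQUENCE dbo.GlobalId AS INT START WITH 1 INCREMENT BY 1;

-- Pull a value explicitly
SELECT NEXT VALUE FOR dbo.GlobalId AS NextId;

-- Or use it as a column default
CREATE TABLE dbo.Orders
(
    OrderId   INT  NOT NULL DEFAULT (NEXT VALUE FOR dbo.GlobalId),
    OrderDate DATE NOT NULL
);

-- With that default in place, DROP SEQUENCE dbo.GlobalId fails because of the dependency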
One of the biggest issues we have with our health is poor monitoring. The same is true for the health of many of our systems. As a DBA, I'm always concerned about the health of my database servers, mainly because I would like to keep my job and not get the 3 a.m. call that something has gone terribly wrong. That said, no matter how careful we are, there is always the chance that something will go terribly wrong.
This is part three of my lessons learned, with more of a focus on database monitoring and disaster recovery.
This is somewhat of a quick or lazy way to truncate and shrink the transaction log, because all you have to do is replace the database name and change your size requirements. It's not the cleanest or best approach, I admit, but sometimes I find myself on a development or staging server and I need some space quickly. There is no requirement on the server to preserve historic data, and since it is not highly transactional I don't care what is in the log file to begin with. This happens often when running data aggregations in our DataMart environment, where log files can grow to 100 GB+ of staging data that is not useful after the process is complete.
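The script itself isn't shown here, but the quick-and-dirty pattern I'm describing looks roughly like this (the database name, logical log file name, and target size are placeholders, and this only belongs on dev/staging servers where the log history doesn't matter):

USE StagingDB;  -- placeholder database name
-- Switch to SIMPLE recovery so the log can be truncated at the next checkpoint
ALTER DATABASE StagingDB SET RECOVERY SIMPLE;
-- Shrink the log file to roughly 1 GB (logical file name is a placeholder; target size is in MB)
DBCC SHRINKFILE (StagingDB_log, 1024);
-- Switch back if the database normally runs in FULL recovery
ALTER DATABASE StagingDB SET RECOVERY FULL;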