Data Science Breakdown

You have undoubtedly heard that Data Science is one of the fastest growing fields in the data industry and one of the best jobs in America.  While many people are interested in a career in data science, they are afraid it might take more than they have to offer.  I was one of these people.  I was afraid that I didn’t have the knowledge or mental (or mathematical) aptitude needed for such a career.  Being the unwavering person I am, I set a goal to learn more and then went on a search for information.  I found the Microsoft MPP in Data Science program and thought, “Well, I can at least give it a try.” 

          *Let me pause here and applaud Microsoft for partnering with EdX.org to assemble this training and make it available to anyone and everyone for FREE.  You can take and complete these classes for free.  The only payment needed is if you decide to pursue verified certificates (needed to complete the MPP Certification).

What it takes

Are you interested in studying data science?  Ask yourself these questions:

  • Do you have an interest in exploring abstract ideas?
  • Are you a curious person?
  • Do you feel comfortable seeking answers in unique ways?
  • Do you love exploring new programs and technology?
  • Are you interested in finding the story within the story?
  • Are you good at finding patterns where there seem to be only random ideas and images?
  • Are you interested in working with data?

What is a Data Scientist?

If you answered yes to the above questions, you just might be the next great Data Scientist!  Let’s break down what a Data Scientist does.  The role of a data scientist is a unique one, as it requires an ability to think on your feet, think outside the box, be creative with technology, and be somewhat of an entrepreneur.  Data science walks the fine line between technology and creative storytelling.  A data scientist is one who knows how to use various means to pull narratives from data to create a great story.  You see, data is not merely a static table of letters and numbers.  No, it is much more than just digits in a row.  Data is a living, breathing, ever-evolving collection of information that is searching for a way to tell its story.  Data scientists are curious, technically equipped storytellers exploring the data landscape for the next great story to share.  Sound interesting?  If so, keep reading!

Data Science Tools

On my journey to becoming a Microsoft MPP in Data Science, I started where we all start… at the beginning.  The very first class in the MPP course is Introduction to Data Science.  This is your typical intro class.  It is easy, but very important.  It will guide you through what to expect and how to navigate the classes, as well as provide an overview of the basic concepts and principles on which data science is based.

There are a number of tools in the data science repertoire.  For the purpose of this blog, we will focus on the tools one can learn through the Microsoft MPP courses. 

Analyzing & Visualizing Data

The first tool we look at is for analyzing and visualizing data.  The MPP course gives you a choice between working with Power BI or Excel.  As I have previous experience with Excel and feel pretty confident there, I chose to learn something new and went with Power BI.  I found Power BI to be a super fun tool that felt more like a video game and less like work.  I love a good visual!  This class easily walked me through setup and through a variety of use-case scenarios.  I found it very fun and easy to learn.  In fact, what struck me the most about these classes is how concise yet easy to follow they are.

Communicate Data Insights

Now that you understand the basics of analyzing and visualizing data, it is important to learn how to master data communication.  It is one thing to be able to look at data and understand it; it takes a completely different set of skills to convey the stories the data has to tell.  In the next course, Analytics Storytelling for Impact, you will learn what a great story is, and what it is not.  This course really dives into how to make an impact through storytelling, gives you an idea of how to create impact through presentations and reports, and shows you how to apply these skills to your data analytics.  I thoroughly enjoyed this class as it spoke to the theater major in me.  I do love to tell a good story, and this class gave me new ways to look at data; it has me questioning things I see every day, like political polls, job descriptions, and advertisements. 

Apply Ethics and Law in Analytics

Ethics?  What does ethics have to do with being a data scientist?  Admittedly, when I first saw that the program had been updated with Ethics and Law in Data and Analytics, I was a bit taken aback.  I thought I had left the legal field and was on the way to a technical role.  Why learn ethics?  Data science and data collection have changed wildly and quickly over the last few years.  It is my firm belief that every data professional needs to take this course.  Only through taking this course did I learn that data can be accidentally prejudiced!  Certainly ethics should be considered when collecting and analyzing data!  A data scientist would be remiss not to exercise due diligence!

Query Relational Data

The data scientist must know how to query databases in order to get the data needed for analysis.  The MPP program offers Querying Data with Transact-SQL, where you will learn to query and modify data in SQL Server or Azure SQL using T-SQL.  If you are not familiar, SQL is pronounced in the industry as “See-Quil,” not “Es-Que-El”… it is a pet peeve of mine to hear someone say S-Q-L when talking to me about SQL.  This course was very thorough and a great way to step into learning how to query and program using T-SQL.  This class will take some effort; I found it to be one of the more intensive classes in the program.  SQL is no easy task, and SQL Server has many versions out there in practical use, each version with different hurdles to jump.  This particular class is a fantastic place to start and to learn a great deal about SQL.

Explore Data with Code

The next step in the program is to explore data with code.  You are given two options here, one path is Introduction to R for Data Science and the other is Introduction to Python for Data Science.  For my interests, I chose Python since it is widely used in many areas, especially advanced analytics and AI.  To my surprise, Python was a lot of fun to learn.  I did more research into the uses of Python and found it to be a very useful tool in my toolkit.  I can even design and program holiday lights for my house using Python!

Apply Math and Statistics to Data Analysis

Whoa, wait…. math?  Math is involved???  Yes, absolutely!  Remember back in school when you thought, “When will I EVER use this again in real life?”  The answer is “Now, and always, honestly.”  There are three classes offered here, so you can choose which one you want to learn.

I chose the Python Edition to continue my use of Python from the last class.  I was not a great math student, so I was really afraid I would not be smart enough to get through this class.  If you are feeling that way, stop that now.  Like I have said before, these classes are designed in such a great way that not only was I able to learn and grow, I made a great grade!  Don’t let fear of failure keep you from trying something new.

Plan and Conduct Data Studies

Again you are given a choice, this time between Data Science Research Methods: R Edition and Data Science Research Methods: Python Edition.  No matter which path you choose, this class teaches the fundamentals of the research process.  You will learn to develop and design sound data collection strategies and how to put the results in context.

Build Machine Learning Models

To be honest, I faced this particular class with dread.  Much to my surprise, I really and truly enjoyed learning about building machine learning models.  You can choose between Principles of Machine Learning: R Edition and Principles of Machine Learning: Python Edition.  If you have previously chosen Python as I did, continue on with that path.  This class offers a clear explanation of machine learning theory through hands-on experience in the labs.  You will use Python or R to build, validate, and deploy machine learning models using Azure Notebooks.

I will make one suggestion, though: before completing this class, I would recommend completing Developing Big Data Solutions with Azure Machine Learning.  As a more visual-based person, I found that I understood the machine learning models much better after completing the course using Azure Machine Learning.

Build Predictive Solutions at Scale

Okay, now we are getting to some really fun stuff!  I think this was my absolute favorite of all the classes.  You can choose from one of three classes.

I chose Developing Big Data with Azure Machine Learning (AML), and what a blast I had!  I can say that working with AML and with Azure Data Studio was like opening up presents on my birthday!  The final projects were a lot of work, but I got a real sense of what working in the field of data science and machine learning is all about… trial and error.  It was a lot of fun using insights, hunches, best guesses, and technology all together to create and train a model that could make accurate predictions!

Final Project

After all the courses are completed and passed, you gain the MPP in Data Science only if you successfully pass the Microsoft Professional Capstone: Data Science.  As of the writing of this blog, I am slated to begin the Capstone on December 31, 2018, and I cannot think of a better way to ring in the new year!

Final Thoughts

I have researched many ways to become a Data Scientist.  Many universities offer degrees in data science, and on the majority of their sites they tout that a Master’s or PhD in Data Science (with a heavy prerequisite of extensive math and stats classes) is what you need in order to become a data scientist.  Must you have an advanced degree in mathematics or engineering to become a data scientist?  Absolutely not.  You don’t even have to hold a degree to work as a data scientist!  Take a look at this article published on Forbes: 4 Reasons Not To Get That Masters In Data Science

My advice is to take a look at the Microsoft MPP program and try a few of the free classes.  If you are truly interested in a data science career and are willing to put forth the time and attention needed to learn, you already qualify as a good candidate.  Don’t let your past dictate your future.  Make the investment in yourself and grow along with the technology as it comes.  You can do this!

What is RPO/RTO?

Time is money!

     Your boss keeps talking about RPO (Recovery Point Objective)  and RTO (Recovery Time Objective).  Do you just nod your head like you know what he/she is talking about?  Maybe that scenario just happened and you are searching the internet for what these terms mean.  If so, welcome!  No one likes to think about disasters, but they happen all too often.  Planning for the worst and hoping for the best will keep your data safe and your job even safer. Let’s take some time and explore what RPO and RTO mean, why these things are important, and what you need to do next to be a hero DBA!

RTO (Recovery Time Objective)

     Recovery Time Objective is the amount of time in which your company expects you to have the database fully restored after a disaster.  That is, how much downtime is acceptable for disaster recovery or planned outages.  Each company is different, and most reference RTO in terms of nines. 

     For a company that measures availability 365 days a year, 24 hours a day, the nines translate as follows:

5 9’s – 99.999% (about 5.26 minutes of acceptable downtime per year)
4 9’s – 99.99% (about 52.6 minutes per year, and much easier to achieve)
3 9’s – 99.9% (about 8.76 hours per year)
2 9’s – 99% (about 3.65 days per year)
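
     If you want to sanity-check these numbers, or recompute them for a different measurement window, a quick back-of-the-envelope query does the trick.  This is purely illustrative T-SQL; swap the 525,600 minutes-per-year figure for whatever window your company actually measures.

-- Illustrative: allowable downtime per year for each availability target.
-- 525600 = 365 days * 24 hours * 60 minutes; replace with your own measured window.
SELECT  v.Nines,
        v.AvailabilityPct,
        CAST(525600 * (1 - v.AvailabilityPct / 100.0) AS decimal(10, 2)) AS DowntimeMinutesPerYear
FROM (VALUES ('5 9''s', 99.999),
             ('4 9''s', 99.99),
             ('3 9''s', 99.9),
             ('2 9''s', 99.0)) AS v (Nines, AvailabilityPct);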

     To decide what RTO is best for your company, you need to take into consideration your data needs.  Not all companies run on a 365/24 schedule.  Some companies only measure downtime between 8am-6pm Monday through Friday, or only on the weekends.  This will drastically change the translation of the 9’s.  Another thing to think about is whether the measured downtime includes time for maintenance or patching, times when the database must be offline.  If maintenance time is eliminated from consideration, meeting the higher 9’s is much easier.

     If your company insists on an RTO of 5 nines and does not take maintenance or patching into consideration, then you must speak with the persons in charge to discuss the RPO.  It is possible to adhere to the strict 5 minutes of downtime, but the point at which you are able to recover will definitely be restricted.

RPO (Recovery Point Objective)

     Recovery Point Objective is the amount of data or work that is acceptable to lose in the event of a disaster.  Ideally, companies will want ZERO data or work loss.  While that IS achievable, it all depends on valid backups and the extent of damage the database suffered at the point of disaster.

     An RPO of 15 minutes means that the data and work must be recoverable to a point within 15 minutes of the disaster; in other words, it is expected that no more than 15 minutes of work or data may be lost.  Stop right here and think about your backup plans and recovery models.  Restoring a database that is in the simple recovery model should not take as long as restoring one in the full recovery model.  It is important to remember (from previous blog posts) that the recovery model dictates how much data you can recover.  It is also important to remember that the ability to recover ANY data at all is fully dependent on having valid backups.
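
     One practical way to ground the RPO conversation is to look at how recently each database was actually backed up; the time since the last log backup approximates the data you would lose if disaster struck right now.  Here is a minimal sketch using the standard msdb backup history tables:

-- How recently was each database backed up?  The gap since the last log backup
-- approximates the worst-case data loss (RPO) if disaster struck right now.
SELECT  d.name AS database_name,
        MAX(CASE WHEN b.type = 'D' THEN b.backup_finish_date END) AS last_full_backup,
        MAX(CASE WHEN b.type = 'L' THEN b.backup_finish_date END) AS last_log_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
        ON b.database_name = d.name
GROUP BY d.name
ORDER BY d.name;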

Run Book

     Another term you might hear is “Run Book”.  A Run Book is a physical or digital collection of the information needed to restart the database in case of disaster.  There are many items that should be included in the run book.  Some of the essential items to consider including are:

  • Server level info, configuration, purpose, etc.
  • List of all databases and applications using them
  • List of agent jobs and proper response to a failure
  • Disaster Recovery process with all contacts, RPO/RTO, etc. required to bring it back (based on level of issue)
  • Security
  • Backup schedules

     When considering a run book, think about what someone would need if they were new to the company and the only person available to restart the database.  What information would that person need?  Making sure your run book is up to date on a regular basis is certainly a great idea!
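
     As a starting point for the server and database sections, even a couple of simple queries captured into the run book go a long way.  This is a hedged sketch; extend it with whatever your environment needs:

-- Basic server-level facts for the run book.
SELECT  @@SERVERNAME                      AS server_name,
        SERVERPROPERTY('ProductVersion')  AS product_version,
        SERVERPROPERTY('Edition')         AS edition;

-- Database inventory with recovery model and state.
SELECT name, recovery_model_desc, state_desc
FROM sys.databases
ORDER BY name;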

Preparing for disaster

     Keep in mind that if you prepare for the worst, you will be less likely to be caught off-guard with a manager breathing down your neck asking “WHEN WILL WE BE BACK UP AND RUNNING?!?!”  Do you have any idea how long it will take to restore your database?  If your answer is “no,” I would suggest doing a restore to see how long it takes.  Further, I would suggest making it a habit to perform drills so that you and your team know what to do in the event of a disaster, and exactly how long it takes to get your company back up and running.  By having a solid backup schedule, validating those backups, and keeping your company’s expectations in mind, you will be ready to handle any data disaster that may be thrown your way.
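
     If you have never timed a restore, a drill can be as simple as restoring your latest full backup under a throwaway name and watching the clock.  The sketch below is illustrative only; the backup path and logical file names are assumptions you would replace with your own (sys.master_files lists the logical names):

-- Practice restore under a different name so the real database is untouched.
-- The path and logical file names below are placeholders for your environment.
RESTORE DATABASE [TestDB_RestoreDrill]
FROM DISK = 'G:\DBA\Backups\TestDB_Full.bak'
WITH MOVE 'TestDB'     TO 'G:\DBA\Data\TestDB_RestoreDrill.mdf',
     MOVE 'TestDB_log' TO 'G:\DBA\Logs\TestDB_RestoreDrill.ldf',
     STATS = 10,  -- report progress every 10 percent
     RECOVERY;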

*Originally posted at Procure SQL:
https://www.procuresql.com/blog/2018/08/31/what-is-rpo-rto/

 

Tail Log Backup

In my SQL Server Recovery Models blog, I touched a bit on my experience with recovery using Tail Log Backups.  In this post we will take an in-depth look at Tail Log Backups: what they are, why they should be in your toolbelt, and, lastly, the steps to successfully take one.

What is a Tail Log Backup?

Simply put, a Tail Log Backup contains log records that were not yet backed up at the time of failure.  So if Transaction Log Backups occur every 15 minutes, and you suffered a loss at the 11-minute mark, the Tail Log Backup includes all data changes during the span between the last successful Transaction Log Backup and minute 11.  A Tail Log Backup can be taken under either the Full or Bulk-Logged Recovery Model, but not under the Simple Recovery Model. 

Why are these important?

Is it possible to recover with no data loss?  YES*!  This is where our new friend comes into action!   Taking a Tail Log backup is done to prevent data loss and helps recover right up to the point of disaster. (This is also referred to as Point In Time Restore.)

Keep in mind: in order to recover your database to its most recent point, you must have a valid Full Backup and valid Transaction Log Backup sequence!

After a disaster, if you can take a Tail Log Backup, have all the preceding Transaction Log Backups, have a valid Full Backup, and you are in the Full Recovery Model, it is possible to recover with NO DATA LOSS!  For this to be possible in the Bulk-Logged Recovery Model, no minimally logged operations can have occurred since the last log backup.

In what case would you ever need a Tail Log Backup?

Any time you have a damaged database and need to restore, it is best to check whether you need a Tail Log Backup.  The question you need to ask is “Do I have Transaction Log Backups?”  If the answer is yes, your recovery will be much faster!  Another question to ask is “Is the server still available?”

Server Still Available

If the database is damaged but the server is still available, it is pretty easy to take a Tail Log Backup.  When the data files are damaged or missing, you will get an error if you try to take a normal log backup.  But if you use NO_TRUNCATE, you will be able to take a log backup.

BACKUP LOG [TestDB] TO DISK = 'G:\DBA\Backups\TestDB_Log_Tail.bck'
WITH INIT,
NO_TRUNCATE;

*Note:  In order to successfully take a Tail Log Backup, you must use NO_TRUNCATE.  That allows the log backup even if the database files are damaged or missing.  Using INIT will overwrite any existing backup set, so you will still end up with only one backup in case the command is run twice.
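
For orientation (recovery gets a full walkthrough in the next post), here is a hedged sketch of where the tail log fits in a restore sequence; the backup names and paths are illustrative only.  The tail log is applied last, WITH RECOVERY:

-- Illustrative restore order: full backup first, log backups in sequence, tail log last.
RESTORE DATABASE [TestDB]
FROM DISK = 'G:\DBA\Backups\TestDB_Full.bak'
WITH NORECOVERY;

RESTORE LOG [TestDB]
FROM DISK = 'G:\DBA\Backups\TestDB_Log_1.trn'
WITH NORECOVERY;

RESTORE LOG [TestDB]
FROM DISK = 'G:\DBA\Backups\TestDB_Log_Tail.bck'
WITH RECOVERY;  -- brings the database online at the point of failure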

Server Not Available

Let’s say the server has crashed and cannot be brought back online.  If you are lucky enough to still have access to all the Full Backup and Log Backup files, you can restore them on another server and recover.

If the database is damaged and the server is not available, taking a Tail Log Backup becomes a little more difficult. Rest assured, there is still an option to try.

To take a Tail Log Backup in this situation:

  1. Create a dummy database with the same name as the one that is damaged.
  2. Set the database offline.
  3. Delete the files from the dummy database.
  4. Drop in the log file from the real database.

--Create a dummy database with the same name
CREATE DATABASE [TestDB];
GO
--Set the database offline
ALTER DATABASE [TestDB] SET OFFLINE;
GO
--Delete the dummy database's files and drop in the log file from the original database
--Then take the Tail Log Backup
BACKUP LOG [TestDB] TO DISK = 'G:\DBA\Backups\TestDB_Log_Tail.bck'
WITH INIT,
NO_TRUNCATE;
GO

Now you are ready to take a Tail Log Backup as detailed above.  This will allow you to recover to the point of failure!  In my next post, we will do a deep dive into Recovery Using Tail Log Backups.

Thank you for reading!

 

*Originally posted at Procure SQL:
https://www.procuresql.com/blog/2018/09/19/tail-log-backups/

 

Deep Dive into Bulk Logged Recovery Model

    Full disclosure time: the Bulk Logged Recovery Model is quite confusing to me.  And, it seems, to many others.  I wrote a bit about it in SQL Server Recovery Models and decided that it was so complex, I really wanted to learn more and explore what works and what doesn’t.  Let’s take a deep dive into bulk logged recovery!


Why would you choose Bulk Logged Recovery?

    Switching from full recovery to bulk logged recovery does have its perks when you have a very large amount of data to insert.  Most notably, in a data warehouse setting, switching to bulk logged recovery to perform bulk inserts makes perfect sense, as you are dealing with very large amounts of data being updated at one time.  Also, when doing an index rebuild, switching to bulk logged recovery can improve performance while performing operations on large amounts of data at once.

Are there better ways to insert a large amount of data at once?

     Bulk Logged Recovery uses minimal logging for bulk-logged operations, which reduces log space usage.  I must add a caveat here: it makes the operation faster and reduces usage in the log file, but it results in a very large log backup. 

“Under the bulk-logged recovery model, if a log backup covers any bulk operations, the log backup contains both log records and the data pages that were changed by bulk operations. This is necessary to capture the results of the bulk-logged operations. The incorporated data extents can make a log backup very large.” (Reference: Myths & Truths)

     However, there is a risk of data loss for the bulk operations, because these operations prevent capturing changes on a transactional basis.  A point in time (PIT) recovery while using bulk logged recovery is not possible, because the minimally logged operations cannot be restored to a specific point.  This can be an issue.  So, if you have a bulk operation that needs to be handled, but you want to ensure point in time restore of each transaction in that operation, what is an alternative solution?  It is important to note that you can indeed restore a transaction log containing bulk logged operations, just not to a particular point in time within it.  Instead, you can take a transaction log backup as soon as the bulk operation is finished and regain PIT recovery from that point forward.
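
     That advice condenses into a simple pattern, sketched here with illustrative names and paths (the walkthrough later in this post does the same thing with real data):

-- Hedged pattern: bracket the bulk operation with log backups to shrink the non-PIT window.
BACKUP LOG [BLRDB] TO DISK = 'C:\DBA\Backups\BLRDB_Log_PreBulk.bak';   -- PIT restore possible up to here
ALTER DATABASE [BLRDB] SET RECOVERY BULK_LOGGED;

-- ...run the bulk operation here...

ALTER DATABASE [BLRDB] SET RECOVERY FULL;
BACKUP LOG [BLRDB] TO DISK = 'C:\DBA\Backups\BLRDB_Log_PostBulk.bak';  -- PIT restore possible again from here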

     You can still perform bulk operations in the full recovery model; it just means that they will be fully logged and that you will not see the performance gains from minimal logging.  That is the price you have to pay: you sacrifice performance for PIT restore of the transactions within the bulk operation.  Ultimately, your decision will have to be based on weighing what your company demands for I/O and RPO/RTO (Recovery Point Objective/Recovery Time Objective).  Do you know what your company’s RPO/RTO plans entail?  Now is a good time to find out!

     Feeling uneasy?  Wondering if there are other ways to process bulk operations?

     There are different methods one can utilize for optimizing bulk operations.  These methods include using minimal logging, batching, disabling triggers and constraints, and many others that can be found here.

How to ensure limited data loss using Bulk Logged Recovery

    So you have decided you are going to use bulk logged recovery and you want to make sure that you are set up for success; there are a few things to keep in mind.  It is recommended that you perform bulk inserts using bulk logged recovery when there is the least amount of activity on your database.  Also, take into consideration how difficult or easy it will be to recreate data if there is a failure during the bulk insert.  There is no PIT restore of the bulk operation using bulk logged recovery; if the bulk operation is interrupted at any point, the entire operation must be performed again in its entirety.

Still want to proceed?

Wait!   

First, before you switch from full recovery, take an extra log backup.  If all goes badly, at least you will be able to get your database back to the point before you switched recovery models.  This is highly recommended!  Skip it, and you risk what we call an RGE (resume generating event).

     Let’s walk through the process of getting ready and switching recovery models.  Our first step in this exercise is to create a table.  We then take a log backup, insert data manually, take another log backup, and then switch to bulk logged recovery.



--Step 1--
--(A full backup of BLRDB, e.g. BLRDB_Full_02232018.bak, is assumed to exist already;
-- the restores later in this post rely on it.)

USE BLRDB
GO

DROP TABLE IF EXISTS dbo.BulkDataTest;

CREATE TABLE dbo.BulkDataTest
(Price money NULL,
ProductID int PRIMARY KEY NOT NULL,
ProductName varchar (25) NOT NULL,
ProductDescription varchar (100) NULL)  -- varchar instead of the deprecated text type
GO

BACKUP LOG [BLRDB]
TO DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0800.bak'

-- Step 2--
Insert into dbo.BulkDataTest
(Price, ProductID, ProductName, ProductDescription)
VALUES ('456', '456123', 'HeroBike', 'Red Bike with Hero Cape Handles');

-- Step 3 --

BACKUP LOG [BLRDB]
TO DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0815.bak'

-- Step 4 --
-- Switch to Bulk Logged Recovery Model --
USE [master]
GO
ALTER DATABASE [BLRDB] SET RECOVERY BULK_LOGGED WITH NO_WAIT
GO
-- Verify recovery model: confirm the database is now in BULK_LOGGED --
SELECT name, recovery_model_desc FROM sys.databases;

Our next steps will be to insert our bulk data, insert manual data, take log backups, switch back to Full Recovery, and take an additional log backup.

-- Step 5 --
-- Run the bulk insert and the single-row insert below together --
USE [BLRDB]
GO

BULK INSERT BulkDataTest
FROM 'C:\DBA\TestDocs\demo_bulk_insert_26.csv'
With (FIELDTERMINATOR = ',' ,
ROWTERMINATOR = '\n' ,
ROWS_PER_BATCH = 100000,
TABLOCK
);
GO

INSERT INTO BulkDataTest
(Price, ProductID, ProductName, ProductDescription)
VALUES ('1099', '1111111', 'HoverCraft', 'BippityBoppityBoop');

Select *
From dbo.BulkDataTest

-- Step 6 --
--take log backup--

BACKUP LOG [BLRDB]
TO DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0900.bak'

-- Step 7 --
--Insert more rows manually--

INSERT INTO dbo.BulkDataTest
(Price, ProductID, ProductName, ProductDescription)
VALUES ('56', '111117', 'TheCheap', 'One of the cheapest bikes ever made'),
('58' , '111118', 'NewerModel', 'This one is for beginners'),
('591' , '111119', 'ABetterOne', 'Okay this one is good') ;

-- Step 8 --
-- Switch back to Full Recovery Mode--
USE [master]
GO
ALTER DATABASE [BLRDB] SET RECOVERY FULL WITH NO_WAIT
GO

Use BLRDB
GO
-- Step 9 --
--Insert more rows manually--
INSERT INTO dbo.BulkDataTest
(Price, ProductID, ProductName, ProductDescription)
VALUES ('36', '111120', 'BoyBike', 'This is a bike for tall 8yo'),
('136', '111121', 'ManBike', 'This is a bike for tall men'),
('236', '111122', 'ShortBike', 'This is a bike for under 5foot');

-- Step 10 --
--Take Log Backup--
BACKUP LOG [BLRDB]
TO DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0930.bak'

Ready for a challenge?

Now let’s simulate a dropped database and walk through the restore!  The following scripts will help you answer the questions below.

Question 1: Restore to the backup taken at step 6.  What is missing?  Do you have the single-row inserts?  Is the data from the bulk insert there?


--Drop Database--
USE Master
GO

DROP DATABASE [BLRDB]

--Restore the full backup, then apply the log backups in sequence--

RESTORE DATABASE BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Full_02232018.bak'
WITH NORECOVERY;

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0800.bak'
WITH NORECOVERY;

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0815.bak'
WITH NORECOVERY;

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0900.bak'
WITH NORECOVERY;

RESTORE DATABASE BLRDB
WITH RECOVERY;

use [BLRDB]
go

SELECT *
FROM dbo.BulkDataTest

--For Question 1, Restore to Step 6 --
--(The database was recovered above, so drop it before restoring again.)

USE Master
GO

DROP DATABASE [BLRDB]
GO

RESTORE DATABASE BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Full_02232018.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0800.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0815.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0900.bak'
WITH NORECOVERY;
GO

--Stop here: do not apply the 0930 log backup, since that would take you past step 6.
RESTORE DATABASE BLRDB WITH RECOVERY;
GO

use [BLRDB]
go

SELECT *
FROM dbo.BulkDataTest

Question 2: Restore to the backup taken at step 10.  What is missing?  Do you have everything?


-- For Question 2, Restore Step 10 --
-- Drop DB--
USE Master
GO

DROP DATABASE [BLRDB]
GO

USE Master
GO

RESTORE DATABASE BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Full_02232018.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0800.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0815.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK ='C:\DBA\Backups\BLRDB_Log_02232018_0900.bak'
WITH NORECOVERY;
GO

RESTORE LOG BLRDB
FROM DISK = 'C:\DBA\Backups\BLRDB_Log_02232018_0930.bak'
WITH RECOVERY;

USE BLRDB
GO

SELECT *
FROM dbo.BulkDataTest

Swimming Back Up

     We have taken the plunge and now it is time to swim back to the surface and discuss what we have found.  Working through this exercise and answering these questions should show you how bulk logged recovery model works, why it is important to switch back to full recovery, and most importantly, why full and log backups are a must!

     What are your answers?  Your answers should show that if you are only able to restore to step 6 (the point at which you took a log backup but had not yet switched back to full recovery and taken another log backup), all data operations performed after your bulk operation have to be recreated!  The second answer should show you that it is imperative to take a log backup before switching to bulk logged recovery, take a log backup after your bulk insert, and take another log backup after reverting back to full recovery.  If you follow those steps, your answer shows that all of your data operations were in fact logged and can be restored, up to the last log backup, without data loss.

*Originally posted on Procure SQL at  https://www.procuresql.com/blog/2018/07/18/deep-dive-into-bulk-logged-recovery-model/

 

Paint Your Success

I like to listen to people.  Really listen.  Listen to what they are saying and what they are NOT saying.  Sometimes the answers are clear and other times the answers are hidden.  One conversation in particular struck me.  It was a conversation between two people, one of whom was complaining that he was not being heard by his team.

This got me thinking about the ways in which we are either not heard or are dismissed.  Perhaps this person was having that issue because he lacked clear communication of his ideas, or was not seen as a strong member of the team.  Verbal, written, and non-verbal communication are key to gaining ground in the areas you desire.

Communication Styles

Non-verbal communication is the ice breaker.  Think about the way you appear walking into a meeting.  Is your head held high?  Do you smile?  Do you look people in the eyes when you speak to them?  The very first moment someone sees you is your first opportunity to make the impression you desire.  This first impression will set the tone for the discussion.  Walk in confident and friendly.  Your audience will take note that you are there to do business and are invested in communicating with them.  They will feel welcomed and at ease.

Verbal communication skills truly are an art form.  Being able to craft sounds and thoughts into words and paragraphs comes easily to some.  For others, this is a craft that seems daunting and elusive.  Being an extrovert helps me excel in my communication skills.  I am largely unafraid to speak with new people.  I thrive on meeting new people and sharing ideas.  However, for those who are more introverted, verbal communication is very uncomfortable and often crippling.  If you are one who has trouble with verbal communication, consider honing your written communication skills while you work on gaining comfort with your verbal skills.

Written communication can emphasize your point, make an idea memorable, or totally destroy a project.  Of course we want to give our projects as much growth potential as possible, so being able to formulate a well-written idea is just as important as verbal communication.  Written communication is tricky because the reader is not afforded the opportunity to see the facial expressions of the writer, so oftentimes jokes come off as harsh criticisms.  The tone of a person’s writing is just as important as the words the writer uses to convey a message.


Bring Forth Your Inner Picasso

Think of creating a painting.  Before creating your masterpiece, you must start with a properly prepared canvas.  Stretching muslin onto a wooden frame is the first step to creating a painting.  The second step is to temper the muslin with a wash of water-thinned paint.  This allows the muslin to harden and provides a strong base for the paint to stick to.  The next step is to allow your creativity to flow through colors, shapes, and movement.  Before you know it, you have created a lovely piece of artwork that truly shows who you are.

Being Picasso

That confused look on your face is one that I see often.  It is okay; I speak in odd metaphors at times.  I paint life with an interesting brush.  Let’s explore how you can become the Picasso of the morning stand-up and get your ideas heard!

  • First, start with the way you enter the room for the meeting.   This is your non-verbal communication that sets the stage for the meeting or conference.  Head up, smile, look people in the eyes.  Assert your position in the room and welcome others into your space. (Stretch your muslin onto the frame and temper the canvas.)
  • Second, allow your ideas to flow through well-crafted words and examples.  Show data to back up your claims.  Explain what reports mean, show the ups and downs.  Give your audience the desired information in the most direct and informative way. (Paint that beautiful masterpiece!)
  • Third, a nice written follow-up will cement your ideas and give your audience a physical takeaway.  Concentrate on the content and intent of your proposal.  Pay attention to your tone and remember that there may be someone reading the report who was not in the meeting to experience your passion.  (Hang that masterpiece in a gallery!)

Communication is just one of the many soft skills that can help propel your career.  I have a session called Become the Most Valuable Player: Soft Skills for the Hard Market.  This year I have been focused on presenting this session at various SQL Saturdays around the country.

Finding my voice

For years I have been so very proud of my husband (T/B), Lance Tidwell, for getting past his fear of public speaking, for putting himself out there, for going all over the US and Canada giving presentations on Microsoft SQL Server issues at SQLSaturdays, User Groups, and even PASS Summit.  I have secretly been somewhat envious of him.  I love to travel, meet new friends, have adventures, and share my thoughts and experiences.  I could do this too, right?  RIGHT?

Let’s back way up.

Many years ago I received my Bachelor’s in Fine Arts.  I studied theater and was on stage quite a bit!  I acted, sang, directed, designed and built sets and wardrobes, etc.  I was very outgoing and felt that I was living larger than life.  After graduating, my life hit a pretty large speed bump and it threw me for a big loop.  I fought to regain myself; however, somewhere along the way in climbing back up, I lost sight of who I was and what I wanted in life.

Instead of chasing my dreams and doing what I enjoyed most, I settled for a stable career in the legal field, where I grew and changed and helped others live out their dreams.  I was quite good at what I did, make no mistake.  I give 100% of myself to every task!  But it was never really “my thing,” and I longed for something more exciting.

Mid-life crisis at MY AGE?

Call it a mid-life crisis, call it divine intervention, call it pure boredom… but I suddenly decided I needed a change in my life.  I wanted to dream big; I wanted to learn and grow in a field where constant change and growth is… well… constant.  Since I am fairly adept in the technical realm, my husband urged me to take some classes to learn data science.  I found some great classes at edX and set off on my way.  I loved those classes!

I began attending SQLSaturdays with him and met the most wonderful, supportive, intelligent, fun, and brilliant people!  They welcomed me with open arms and encouraged me to grow and learn with them.  I was becoming part of the wonderful #SQLFamily!  I even received my first SQL job offer, at Procure SQL, with a great mentor and friend, Microsoft MVP John Sterrett (T/B).  Fast forward a year or so, and I submitted a session for my first SQLSaturday!  I was accepted as a speaker for SQLSaturday Nashville!  OH MY!

Koalified

I am koala-fied to do this!

Oh my, how exciting!  Oh my, how wonderful!  OH MY, what have I done?

That’s right… I did it.  I made a speaker profile, I attached a cute photo, I wrote an abstract, and then I hit the almighty SUBMIT button.  There was no turning back!  And then, bam: “Congratulations, you have been accepted to speak at…”  The words made my heart race, my head swim; my inner theater kid jumped for joy!  I did it!  I was accepted!

I was accepted… oh my.  OH MY!  I need to write a session!  Where do I start?  What do I do?  What have I done?  What will they think of me?  Will my session be any good?  Will anyone even show up?  The thoughts, fear, and doubt swirled like snow in the wind.  I just had to stop, take a breath, get my thoughts in order, and put that theater kid to work!

The steps I took to build the session and my confidence, the experience of giving the session, and the outcome will all be covered in the next blog.  Stay tuned!