Nov 09 2018

As part of the PASS Summit conference session that Bob Pusateri and I are presenting right now, “Cosmos DB for SQL Server Pros”, we are demonstrating a number of Azure Cosmos DB features that we believe you can leverage through code. All of the code we’re demonstrating today is now available for you to download and explore.

This code demonstrates how to collect Windows Perfmon metrics with C#, package them up into a Cosmos DB document, insert the document into Cosmos DB, and then fetch and query individual and multiple documents through Cosmos DB’s SQL-like query interface.
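
If you want a feel for the shape of that flow before downloading the full project, here’s a stripped-down sketch (not the actual conference code) using System.Diagnostics.PerformanceCounter and the Azure Cosmos DB .NET SDK. The account endpoint, key, database, container, and partition key names are all placeholders, so adjust them to your environment.

```csharp
// Minimal sketch of the collect -> package -> insert -> query flow.
// Assumes the Microsoft.Azure.Cosmos NuGet package and a Windows host
// (PerformanceCounter is Windows-only). Endpoint/key/names are placeholders.
using System;
using System.Diagnostics;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class PerfSample
{
    public string id { get; set; }          // Cosmos DB requires an "id" property
    public string serverName { get; set; }
    public DateTime collectedAt { get; set; }
    public float cpuPercent { get; set; }
}

public static class PerfmonToCosmosDemo
{
    public static async Task Main()
    {
        // 1. Collect a Perfmon metric.
        var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        cpu.NextValue();                     // the first read always returns 0
        await Task.Delay(1000);
        var sample = new PerfSample
        {
            id = Guid.NewGuid().ToString(),
            serverName = Environment.MachineName,
            collectedAt = DateTime.UtcNow,
            cpuPercent = cpu.NextValue()
        };

        // 2. Package it up and insert it as a document.
        //    Container assumed to be partitioned on /serverName.
        var client = new CosmosClient("https://<your-account>.documents.azure.com:443/", "<your-key>");
        var container = client.GetContainer("PerfmonDb", "Samples");
        await container.CreateItemAsync(sample, new PartitionKey(sample.serverName));

        // 3. Query documents back out with the SQL-like syntax.
        var query = new QueryDefinition("SELECT * FROM c WHERE c.serverName = @server")
            .WithParameter("@server", sample.serverName);
        var iterator = container.GetItemQueryIterator<PerfSample>(query);
        while (iterator.HasMoreResults)
        {
            foreach (var doc in await iterator.ReadNextAsync())
                Console.WriteLine($"{doc.collectedAt:o}  CPU {doc.cpuPercent:F1}%");
        }
    }
}
```

The downloadable project is the place to go for the full set of demos; this is just the skeleton of the flow.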

As always, there’s no warranty with this code, and please don’t distribute it as your own, yada yada yada. I’m not the best programmer, so if anyone has improvements, I’d love for you to send me your updates to check out!

Nov 06 2018

What a whirlwind year this has been! I’m thrilled to invite you to my sessions at this year’s PASS Summit, happening this week in Seattle, WA.

 

Tuesday from 11:30am to 1:00pm, I’m part of the cloud availability panel over lunch at the SentryOne Data Day precon at the Summit. I hope you’ve already registered for this exciting opportunity to learn more about performance- and availability-related topics in this whole-day discussion.

Next, for those of you who still have your mission-critical workloads in your on-premises datacenter (or even for those who have them in the cloud, because almost all of the topics directly apply!), come to my session Wednesday morning at 10:45am entitled “On-Prem SQL Servers, Interstellar Performance”. This deep dive into virtual SQL Server performance will help you squeeze the most out of your virtualized SQL Servers, and will help you communicate with your infrastructure team about the things that matter most to performance-oriented SQL Servers. You’ll leave with plenty of real-world tips and tricks to take back to the office as soon as you return from the conference.

Thursday morning I’m part of the SIOS PASS the Bacon breakfast panel, where we’ll get a good chuckle while we discuss a number of topics pertinent to today’s DBA, such as availability both on-prem and in the cloud, and disaster mitigation and preparation. Come register here, get a great breakfast on the house, and learn while you laugh!

Next, Bob Pusateri and I have a session Thursday afternoon called “SELECT STARS: A SQL DBA’s Introduction to Azure Cosmos DB”, where we present Microsoft’s new Cosmos DB offering on the Azure platform. Cosmos DB is a completely different database model from what most SQL Server professionals are accustomed to, and we’ll give you the details you need to expand your horizons and learn about this exciting platform. This session is the ‘what’ session, meant to get you amped to learn more about the platform on Friday.

On Friday, my final session of the conference, entitled “Cosmos DB for SQL Server Pros”, is designed to give you more of the ‘how’ as you continue your exploration of Cosmos DB. Bob Pusateri and I will share more of the use cases, scalability details, and live code demos of how to access and alter data in a Cosmos DB database.

I hope to see you at the show! If you want to sync up and talk geek, make sure to ping me to set up some time over your favorite form of caffeine.


Jul 24 2018

I hope to see you all at this year’s VMworld 2018 USA conference, where I’m lucky enough to have been selected to present four SQL Server-themed sessions!

The first is an all-day boot camp (VAP3768WU) with Oleg Ulyanov from VMware, where we take participants through a comprehensive journey of management, performance tuning, and business continuity planning for high-performance SQL Servers on the VMware virtualization platform. It runs the Sunday before VMworld officially ramps up, so if you have infrastructure engineers attending this conference, please tell them to register for this boot camp.

Next, Thomas LaRock from SolarWinds (@sqlrockstar) and I will be presenting a session called “Performance Deep Dive for Demanding Virtual Database Servers” (VAP1425BU). We’ll be discussing advanced performance-related topics such as CPU scheduling and vNUMA, “right-sizing” for both performance and licensing, virtual disks and performance discrepancies, and general operational efficiency.

Next, Michael Corey from LicenseFortress (@michael_corey) and I will be presenting a session called “Database Virtualization (Monster VMs) with VMware vSphere: Doing IT Right” (VAP1296BU), which is an updated version of our continued track of performance-related tips and tricks. Come see us laugh our way through over a hundred slides packed full of things you can take back to your environments!

Finally, I have a solo session called “Database vDisks and You” (VMTN5520B), where I’ll talk about the various layers of queueing between the SQL Server database and the enterprise SAN, how to measure and monitor these layers, and ways to tune each layer for performance.
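
To give you a taste of the measurement side ahead of time, here’s a quick illustrative sketch (not session material) that samples a few of the stock Windows disk counters I’ll be referencing. The drive letter is a placeholder, and remember these counters only show the OS layer of the queueing stack, not the hypervisor or SAN layers underneath.

```csharp
// Illustrative only: sample Windows disk queueing/latency counters for one
// logical disk. The instance name ("C:") is a placeholder; point it at the
// volume hosting your data/log files. Windows-only (System.Diagnostics).
using System;
using System.Diagnostics;
using System.Threading;

class DiskLatencySampler
{
    static void Main()
    {
        string disk = "C:";
        var queue        = new PerformanceCounter("LogicalDisk", "Avg. Disk Queue Length", disk);
        var readLatency  = new PerformanceCounter("LogicalDisk", "Avg. Disk sec/Read", disk);
        var writeLatency = new PerformanceCounter("LogicalDisk", "Avg. Disk sec/Write", disk);

        // The first NextValue() call primes each counter; later calls return data.
        queue.NextValue(); readLatency.NextValue(); writeLatency.NextValue();

        while (true)
        {
            Thread.Sleep(1000);
            Console.WriteLine(
                $"Queue: {queue.NextValue():F2}  " +
                $"Read: {readLatency.NextValue() * 1000:F1} ms  " +
                $"Write: {writeLatency.NextValue() * 1000:F1} ms");
        }
    }
}
```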

As always, it’s going to be a great show! Ping me if you want to meet up in person while at the show, and let’s geek out and talk shop! Register for this conference here, and make sure to enroll in the SQL Server on VMware boot camp!

 

Apr 10 2018

I am proud to be delivering an in-person presentation for the New England SQL Server Users Group entitled “Level Up Your Cloud Infrastructure Skills” on Wednesday, April 11, at 6pm Eastern.

Abstract: Think infrastructure in the cloud is still just for sysadmins? Think again! As your organization moves into the cloud, infrastructure skills are more important than ever for DBAs to master. Expert knowledge of cloud-related infrastructure will help you maintain performance and availability for databases in the cloud. For example, know what an IOP is? How many does your database consume during a given day? Properly sizing a cloud database depends on your knowledge of this metric. Failure to properly configure storage performance at the time of deployment will slow down your SQL Server considerably. Come learn many of the key cloud infrastructure points that you should master as the DBA role continues to evolve!
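
If the IOPS question has you curious before the session, here’s a rough, illustrative way (not part of the presentation itself) to estimate a database’s combined read-and-write IOPS by sampling sys.dm_io_virtual_file_stats from C#. The connection string and database name are placeholders.

```csharp
// Rough illustration: estimate a database's read+write IOPS by sampling
// sys.dm_io_virtual_file_stats twice and dividing the delta by elapsed time.
// Connection string and database name are placeholders.
using System;
using System.Data.SqlClient;
using System.Threading;

class IopsSampler
{
    const string ConnString = "Server=.;Database=master;Integrated Security=true";
    const string Query = @"
        SELECT SUM(num_of_reads + num_of_writes)
        FROM sys.dm_io_virtual_file_stats(DB_ID(N'YourDatabase'), NULL);";

    static long TotalIo()
    {
        using (var conn = new SqlConnection(ConnString))
        using (var cmd = new SqlCommand(Query, conn))
        {
            conn.Open();
            return Convert.ToInt64(cmd.ExecuteScalar());
        }
    }

    static void Main()
    {
        var before = TotalIo();
        var start = DateTime.UtcNow;
        Thread.Sleep(TimeSpan.FromSeconds(60));   // widen the window to profile a full day
        var elapsed = (DateTime.UtcNow - start).TotalSeconds;
        Console.WriteLine($"Approx. IOPS: {(TotalIo() - before) / elapsed:F1}");
    }
}
```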

RSVP for this session here! Bring your questions!

Mar 01 2018

You might have noticed that I’ve been pretty quiet as of late. We’re working on a super top secret internal project here at my company, and we need to ingest a LOT of data around the clock for some analytics work. My preferred DBMS is, of course, Microsoft SQL Server, and like a lot of DBAs, we want to make this Swiss Army knife of a relational DB platform do everything we can dream up. Thankfully, it can perform most of the tasks we throw at it pretty well. But the pragmatist in me asks: “Is this the best tool for the job?” Because we’re just starting this project, we can step back a bit and look at all of our options.

For our project, we do not want to deal with a datacenter of our own. Yes, we’re known as on-prem virtualization enthusiasts, and there are certainly many reasons for keeping things on-prem for some time to come, but cloud is the right choice for us for this project. We’re working on the cloud platforms just as much as we are on-prem these days, and we’re seeing the shift occurring across the industry.

Take a look at the costs of SQL Server licensing in the cloud. To design a SQL Server that can consume upwards of a few million data points a minute, we’re likely to need to spend quite a bit of capital on this platform. It’s just overkill for a straightforward ingest-then-export workload. On top of that, we need to accommodate high availability, disaster recovery, reporting, and analytics needs.

Hmm.

Cloud brings some differences that might be advantageous here. We’re partial to Microsoft Azure as our company’s internal cloud platform, so what does Azure have that can help us?

Azure Cosmos DB.

Now, it’s not as simple as that. Cosmos DB is a collection of APIs for different database models under the hood: SQL (Core), MongoDB, Cassandra, Gremlin (graph), and Table.

Each one is used differently, and the options differ quite a bit in operation and architecture. Of the five listed platform APIs, which should we use? That’s a good question. For this particular project, we want the ability to store tons of inbound data and then pull it back out for analysis. The Azure Table API seems to work best for this purpose.
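
To make that a little more concrete, here’s roughly what ingesting and reading back a data point through the Table API looks like. This is a hedged first-pass sketch rather than our actual project code: the table name, partitioning scheme, entity shape, and connection string are all placeholders, and I’m using the Microsoft.Azure.Cosmos.Table SDK purely for illustration.

```csharp
// First-pass sketch of writing and reading telemetry through the Table API.
// All names and the connection string are placeholders, not our project's.
using System;
using Microsoft.Azure.Cosmos.Table;

public class DataPoint : TableEntity
{
    public DataPoint() { }                                  // required by the SDK
    public DataPoint(string source, DateTime collectedAt)
    {
        PartitionKey = source;                              // group points by source
        RowKey = collectedAt.ToString("o");                 // sortable timestamp key
    }
    public double Value { get; set; }
}

public static class TableApiDemo
{
    public static void Main()
    {
        var account = CloudStorageAccount.Parse("<cosmos-table-api-connection-string>");
        var table = account.CreateCloudTableClient().GetTableReference("Telemetry");
        table.CreateIfNotExists();

        // Ingest a data point.
        var point = new DataPoint("sensor-01", DateTime.UtcNow) { Value = 42.0 };
        table.Execute(TableOperation.Insert(point));

        // Pull everything for one source back out for analysis.
        var query = new TableQuery<DataPoint>().Where(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "sensor-01"));
        foreach (var p in table.ExecuteQuery(query))
            Console.WriteLine($"{p.RowKey}  {p.Value}");
    }
}
```

Partitioning by source keeps each feed’s points together, which matters a lot more at the “few million data points a minute” scale we’re targeting; more on that in the upcoming posts.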

SO! Over the next few months, expect a number of blog posts from me here exploring Azure Table on Cosmos DB and the questions, challenges, and experiences we have while ramping up on this new platform.
