I thought I would do some cleaning up in my SSDT solution, which turned out to be a bad idea.
In my solution I have three database projects whose names all start with “Database”. It seemed like a good idea at the time: I also have several unit test projects matching the database projects, so I prefixed the project names and ended up with names like “UnitTestAdventureWorks” and “DatabaseAdventureWorks”.
I changed my mind, as the icons for the database projects and unit test projects are different anyway, and it just looked a lot cleaner without “Database” and “UnitTest” in the project names.
Once I got rid of the prefixes and started to build…BAM: could not find project file “DatabaseAdventureWorks”. Of course not, I changed it a minute ago.
My first idea was to rename the project back to “DatabaseAdventureWorks”. Visual Studio asked me whether it should rename the database in LocalDB as well. I said yes, hoping it would roll back everything. It not only renamed the database with the Database prefix but also updated the project settings. I’m fine with the project settings, but not with the database changes.
When I changed the name of the database in SSOX (SQL Server Object Explorer) back to normal (who wants their database named DatabaseAdventureWorks anyway? We know it’s a database!), Visual Studio did not ask me to update the project settings.
I have the impression that the names in the project settings, SSOX and Solution Explorer are not automatically kept in sync. Building the project does not help either.
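For the record, renaming the database itself can also be done in T-SQL against the LocalDB instance. A minimal sketch, assuming the instance and database names from my setup (adjust both to your own):

```sql
-- Connect to the LocalDB instance the project deploys to, e.g. (localdb)\ProjectsV13,
-- then rename the database back. The names below are just from my setup.
ALTER DATABASE [DatabaseAdventureWorks] MODIFY NAME = [AdventureWorks];
```

This only renames the database; the project settings and Solution Explorer still need to be checked by hand.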
If anyone has some good advice or a workaround, shoot!
If you’re like me, you find blogging very hard, for the following reasons:
There is so much misinformation about SQL Server on the Internet and I don’t want to be part of that. Does this mean I suck? I don’t think so (I hope not), but I personally think that if you’re going to write a blog post the entire SQL Server community might read, it might as well be good…damn good!
One of the topics I would like to write about is the errors and scenarios you end up with at work. So sure, there’s a lot to blog about, but I don’t want to post screenshots of my professional setup with potential security issues in them. Yes, you can blur some things out, but take a screenshot of a Visual Studio solution, for example: it will consist almost entirely of data that should not be out in the open.
I hate writing about topics that have been written about dozens of times already.
I wish I could blog more often, I even invested in dedicated hosting, but the occasions are rare. When I blog, it will usually be about topics I know I will forget because my long-term memory fails me, and if I can share them with you, why not? That’s a bonus, right?
Whenever you are using a command line tool like bcp.exe in an Execute Process Task, the command line window pops up and closes very quickly, which makes it hard to debug.
A small trick is to wrap the executable in a batch file, throw in a pause, and call the batch file from the Execute Process Task. Upon execution, the command line window appears and will only continue after you press a key.
The batch file might look like this:
The %* accepts all the parameters passed from within SSIS to bcp.exe.
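Putting it together, a minimal sketch of such a batch file (assuming bcp.exe is on the PATH; the file name is just an example, point the Execute Process Task at wherever you save it):

```bat
@echo off
rem RunBcp.bat - forward every argument the Execute Process Task passes in to bcp.exe
bcp.exe %*
rem Keep the command line window open so the output can be inspected
pause
```

Once the debugging is done, remember to remove the pause (or call bcp.exe directly again), otherwise the package will hang waiting for a key press.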
Whilst debugging an SSIS package and tracking the values of variables in the Locals window, I noticed old variables which I had deleted earlier in the project.
I did a rebuild, closed and reopened the solution. I even rebooted the machine! Every single time the old variables appeared, even though they had been removed, and then…it hit me: the checkpoint file! I deleted the existing checkpoint files and voilà, the old variables no longer appeared…
I was tasked with a simple, easy peasy transactional replication setup (is there such a thing?) between two countries. I came in, grabbed a cup of coffee, put Rammstein on the headphones and thought it was going to be a smooth morning. What could go wrong…
When I “tried” to configure the distributor, the wizard threw the following error at me: Property LoginSecure cannot be changed or read after a connection string has been set. (Microsoft.SqlServer.ConnectionInfo)
Mmm, almost all answers on the web pointed me to the following solution:
Right click local publication followed by Distributor properties
Select the publisher in the list, click the browse button
In agent connection mode, select “impersonate the agent process account”
Well yes, I am looking at it: “impersonate the agent process account” seems to be the default setting.
I was totally lost at that moment and I knew it was going to be a long “short morning”.
I wanted to generate scripts for replication; I knew no replication was set up at the moment, but I just wanted to try for the heck of it.
The Generate Scripts wizard came up with a warning saying I had to use the proper server name, not an IP address or alias, and there it was…SSMS was connected to the instance using an IP address. Once I connected using the server name, no more errors when finishing the New Distributor wizard.
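If you are not sure what the proper server name is, you can ask the instance itself; connecting with the result of this query instead of the IP address is what fixed it for me:

```sql
-- Returns the server\instance name replication expects you to connect with
SELECT @@SERVERNAME;
```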
The answer is no. At least not since SQL Server 2008.
Have a look at your model database on a new instance, the Full-Text check box is checked and greyed out.
Stopping the SQL Full-text Filter Daemon Launcher service isn’t going to get rid of the setting either.
Even the ALTER DATABASE command on BOL doesn’t say anything about it except changing the default full-text language.
The reason is very simple. Since SQL Server 2008, Full-Text Search (FTS) is built into the database engine, and therefore all databases are full-text enabled by default. However, keep in mind that Full-Text Search is an optional feature during the setup of SQL Server.
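You can verify both claims from T-SQL: whether the Full-Text component was installed with the instance, and whether a given database is full-text enabled (since SQL Server 2008 the latter always returns 1 when the component is installed):

```sql
-- 1 if the Full-Text component was installed on this instance, 0 if not
SELECT FULLTEXTSERVICEPROPERTY('IsFullTextInstalled') AS FullTextInstalled;

-- 1 if the database is full-text enabled; always 1 since SQL Server 2008
SELECT DATABASEPROPERTYEX('model', 'IsFullTextEnabled') AS FullTextEnabled;
```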
If you’re like me, the first thing you do when they put you in front of a SQL Server instance you have never worked on before is look for misconfigurations, or simply put: let’s see how they messed up…
I keep running into the following bad practices:
Auto-Shrink and shrink jobs. Seriously? Probably the holy grail of bad SQL Server practices.
The World Wide Web Publishing Service is running.
Why? Give me a reason why you would run the WWW service on a dedicated SQL Server instance in a production environment.
Yes, there might be a short list of reasons but let’s be honest, how many instances are out there running this service? Indeed…
Give me one good reason why you need Internet access on it. Do you check your mail on the production SQL Server? Play Angry Birds during lunch?
Yes, you know I’m talking about you. Why do you keep using RDP to connect to the SQL Server machines?
As a database administrator, I can’t think of any tasks you won’t be able to execute from a remote machine.
It’s a waste of resources you’re using on a production machine.
Client tools installed on the server.
Yes, go ahead, install SSMS and BIDS on the server so the guys in China can do their development over RDP…
Yes, run all services under the same account, so when something happens to that one account, all services break down at once.
Use dedicated domain accounts with minimal rights to run the services instead.
Service accounts with sysadmin privileges.
Why? An ordinary domain account should do. Sysadmin service accounts are a security threat.
Power plan set to balanced.
Sure, let’s save some power running this very important heavily queried instance.
Setting the power plan to high performance actually boosts the performance of the instance.
It’s improving but I still see a lot of production instances with only one drive.
If it’s up to me, I would even use dedicated drives for indexes.
Use dedicated disks for the mdf files, the ldf files and TempDB. Don’t forget to put the local backups on a dedicated drive as well: taking a backup is a write process, and you don’t want to query (read) data from the same disk that is busy writing a backup.
I try to maintain at least 15% free disk space on all drives.
Conventional hard drives are filled starting at the outer edge of the platter, which is the fastest part of the disk; the closer to the center the data ends up, the lower the transfer speed. Keeping free space helps keep data off the slowest tracks.
There are plenty more of those but 10 is a nice number.
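A quick way to check a new instance for the first item on the list; this query returns every database that has Auto-Shrink switched on (ideally, no rows):

```sql
-- Databases with the Auto-Shrink option enabled
SELECT name
FROM sys.databases
WHERE is_auto_shrink_on = 1;
```

Shrink jobs are harder to spot automatically; have a look through the SQL Server Agent jobs for DBCC SHRINKDATABASE or DBCC SHRINKFILE calls as well.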
Join experts Kurt Roggen (MVP) and Mike Resseler (MVP) for the Private Cloud Roadshow.
In this half-day session you will learn more about setting up a private cloud infrastructure and how you can monitor it. Learn how to create your private clouds and how to deploy standardized applications or services into these clouds. As a final session, you will learn how you can provide automation in your private cloud.
There are 2 options to attend (same content, different location):
This TechNet event is free of charge and Microsoft will be giving away one Nokia Lumia 800 Windows Phone in each location. Detailed agenda:
13:00 - 13:30 Welcome & Coffee
13:30 - 14:30 Building your Private Cloud Infrastructure
Learn how to build your Private Cloud infrastructure by dealing with fabric management (compute/hypervisors, storage, network), which will serve as the basis for the private cloud that you will be creating. We will discuss how to deploy, configure and manage each of these elements in your datacenter.
14:30 - 14:50 Break
14:50 - 16:30 Creating, Monitoring & Operating your Private Cloud
Learn how to create your private clouds and how to deploy standardized applications or services into these clouds. Learn how to monitor your clouds and how you can handle change requests. All these key areas will be addressed to give you an idea of what happens in a private cloud after it is up, running, and in production.
16:30 - 16:50 Break
16:50 - 18:00 Automating & Delivering Services in your Private Cloud
Learn how you can provide automation in your private cloud. Also discover how your cloud services can be offered and consumed using different self-service portals, and what the differences between them are.
We’ve all been there: an end user sends you an e-mail saying “it does not work”…OK, what does not work? And what exactly did you do?
I recently found out there is a nifty little tool that comes with Windows 7: Problem Steps Recorder, or simply PSR.
It runs from the command prompt or the Run box; simply type PSR. You hit Start Record and start reproducing the error or the test scenario. It takes a screenshot of every click and even adds a description like:
Problem Step 1: (2/04/2012 21:00:02) User left click on “Object Explorer Hierarchy (outline)” in “Microsoft SQL Server Management Studio”.
It even highlights the sections you clicked or changed (see green rectangle in screenshot below).
After stopping the recording, an MHTML file gets zipped and you have the option to save the file or send it by e-mail. It’s not only an excellent way of reproducing errors or test scenarios, it can also be used for documentation.
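PSR can also be driven from the command line, which is handy for scripted or unattended recordings. A sketch using the switches I know of (check psr.exe /? on your machine; the output path below is just an example):

```bat
rem Start recording straight to an output file, capturing screenshots (max 100)
psr.exe /start /output "C:\Temp\repro.zip" /sc 1 /maxsc 100

rem ... reproduce the problem ...

rem Stop recording; the zipped MHTML report ends up in C:\Temp\repro.zip
psr.exe /stop
```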
Another edition of TechEd Europe is coming up on 26-29 June in Amsterdam.
Four days with more than 1,000 educational opportunities and hands-on training across the full range of Microsoft products and solutions, including 400 breakout sessions spread across 12 tracks and 250 self-paced labs created by product experts.