# Wednesday, August 13, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases. 

In our last tutorial, we described how a database developer would use the Schema Compare tool to update a database project with changes to a SQL Server database.

This article describes how to use the Schema Compare tool to push those changes out to a different SQL Server database.  There are two scenarios where you would do this. 

In Scenario 1, a developer has a local copy of the development database and wishes to get the latest updates to the database. 

In Scenario 2, a database administrator (DBA) or build master is charged with migrating database changes from one environment to the next.  Just as .Net and web code gets regularly migrated from a development environment to a QA or Production environment, database object code must also be migrated, and that migration generally must be kept in sync with all code that depends on those database objects.

The database to which we wish to write the changes is known to Data Dude as the “target database”.  We need to make sure that a connection exists in Visual Studio to the target database.  This is a one-time step and you can use the Server Explorer (View | Server Explorer) to create a connection.

The following steps describe how to propagate changes to the database.

1.       Launch Visual Studio and open the database project.  Get the latest source code from your source code repository.

2.       From the Visual Studio menu, select Data | Schema Compare | New Schema Compare.  The New Schema Compare dialog displays.

    Figure 1

3.       Under Source Schema, select the Project radio button and select your database project from the dropdown list.

4.       Under Target Schema, select the Database radio button and select the connection to your database from the dropdown list.

5.       Click the OK button to display the Schema Compare window.

    Figure 2

6.       The Schema Compare window lists every object that exists in either the database or the database project.  The objects are grouped in folders by object type (tables, views, stored procedures, etc.).  You can expand or collapse a folder to view or hide objects of that type.  The important column is “Update Action”, which describes what will happen if you write the updates to the target.

a.       Objects that exist in the source (the project) but not in the target (the database) were likely added to the project after the last synchronization.  By default, the Update Action will be “Create”, meaning the object will be created in the target database.

b.      Objects that exist in both the source and the target will have an Update Action of “Update” if they differ between the project and the database, or “Skip” if they do not.

c.       Objects that exist in the destination (the database) but not in the source (the project) were likely dropped from the project after the last synchronization.  By default, the Update Action will be “Drop”, meaning the object will be removed from the target database.

7.       On a database with many objects, it is useful to view only the objects that have changed since the last synchronization.  To do this, click the Filter toolbar button and select Non Skip Objects. 

    Figure 3

8.       If you wish, you can modify the Update Action of an object by selecting the dropdown in the “Update Action” column.  Some actions are grayed out because Data Dude will not allow you to perform any action that would violate referential integrity rules. 

9.       After you have set the “Update Action” of every object appropriately, you have a couple of options.

a.       You can migrate your changes immediately to the target database by clicking the “Write Updates” toolbar button.  Click Yes at the confirmation to write the updates to the target database.

b.      Alternatively, you can export your changes to a SQL script by clicking the Export To Editor toolbar button.  This will create a single text file containing SQL script that you can run from a query window in SQL Server Management Studio.  This is useful if you need to make changes to the script before executing it.  I have used this technique when my database contains views or stored procedures that refer to remote servers and I want to modify the server name before migrating the object.
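To give a sense of what comes out of the export, here is a sketch of the kind of script Schema Compare produces.  All object and server names below are invented for illustration; your script will contain the DDL for whatever objects you chose to update.

```sql
-- Illustrative only: roughly the shape of a script exported by Schema Compare.
-- Table, view and server names are placeholders.
ALTER TABLE [dbo].[Customer]
    ADD [MiddleName] nvarchar(50) NULL;
GO

-- A view that refers to a remote server: the kind of object whose
-- server name you might want to edit before running the script.
ALTER VIEW [dbo].[vwRemoteOrders]
AS
    SELECT OrderId, CustomerId, OrderDate
    FROM [DevServer01].[Sales].[dbo].[Orders];
GO
```

Because the export is plain T-SQL, you can edit it, check it into source control, or hand it to a DBA to run during a scheduled release window.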

Alternatively, you can deploy changes from a database project to a database by “deploying” the project (select Build | Deploy Solution).  This deploys your changes using the settings found on the Build tab of the project properties page.  This method requires fewer steps, but it is less flexible than the method described above.  In particular, it does not allow you to select which objects are deployed or to export and modify the script of database changes.

In the next article, we will discuss how to use Data Dude to write Unit Tests against SQL Server stored procedures.

.Net | SQL Server | VSTS
Wednesday, August 13, 2008 12:46:54 PM (GMT Daylight Time, UTC+01:00)
# Tuesday, August 12, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases. 

In our last tutorial, we described how to create a new database project based on an existing SQL Server database.  As the source database changes, you will want to update the database project to reflect those changes.  This article describes how to use the Schema Compare tool to import database schema changes into your database project.  The steps in this article are typically performed by a database administrator (DBA) or database developer who is charged with creating tables, views, functions and stored procedures.

The Schema Compare tool can be used to display and manage differences between two databases, between two database projects, or between a database and a database project.  Most of the time, I use it to compare a database with a database project. 

After making a change to a database schema (for example, adding a new table or adding a new column to a table), use the Schema Compare tool as described below to update an existing database project with these changes.
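For instance, a schema change as small as the following is enough to put the database and the project out of sync; Schema Compare will flag the table for update.  The table, column and constraint names here are hypothetical.

```sql
-- Hypothetical change made directly against the development database.
-- After this runs, Schema Compare will show dbo.Customer with an
-- Update Action of "Update".
ALTER TABLE [dbo].[Customer]
    ADD [LoyaltyPoints] int NOT NULL
        CONSTRAINT [DF_Customer_LoyaltyPoints] DEFAULT (0);
GO
```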

1.       Launch Visual Studio and open your Database Project. (For info on how to create a database project from a SQL Server database see: http://www.davidgiard.com/2008/08/11/DataDudeTutorial1CreatingADatabaseProject.aspx )

2.       From the Visual Studio menu, select Data | Schema Compare | New Schema Compare.  The New Schema Compare dialog displays.

    Figure 1

3.       Under Source Schema, select the Database radio button and select the connection to your database from the dropdown list.

4.       Under Target Schema, select the Project radio button and select your database project from the dropdown list.

5.       Click the OK button to display the Schema Compare window.

    Figure 2

6.       The Schema Compare window lists every object that exists in either the database or the database project.  The objects are grouped in folders by object type (tables, views, stored procedures, etc.).  You can expand or collapse a folder to view or hide objects of that type.  The important column is “Update Action”, which describes what will happen if you write the updates to the target.

a.       Objects that exist in the source (the database) but not in the target (the project) were likely added to the database after the last synchronization.  By default, the Update Action will be “Create” meaning the object will be created in the database project.

b.      Objects that exist in both the source and the target will have an Update Action of “Update” if they have been modified in the database since the last synchronization or “Skip” if they have not.

c.       Objects that exist in the destination (the project) but not in the source (the database) were likely dropped from the database after the last synchronization. By default, the Update Action will be “Drop” meaning the object will be removed from the database project.

7.       If you are updating your database project frequently, most of the objects will be unchanged and marked “Skip”.  On a database with many objects, it is useful to view only the objects that have changed since the last synchronization.  To do this, click the Filter toolbar button and select Non Skip Objects. 

    Figure 3

8.       At this point, you can view differences and you may wish to modify the Update Action of some objects.

a.       If you click on an object row in the Schema Compare window, the SQL definition code of both the source and destination version appears in the Object Definition window.  Any differences between the two versions will be highlighted (changed lines in darker blue; new lines in darker green).

b.      If you like, you can modify the Update Action of any object by selecting the dropdown in the “Update Action” column.  Some actions are grayed out because Data Dude will not allow you to perform any action that would violate referential integrity rules.  If several developers are sharing the same development database, you may wish to skip those objects on which you are not working.  You may also decide that some objects are ready to share with the rest of the team while others are not fully tested and should be skipped.  It is possible to change the Update Action of every object of a given type by right-clicking the type folder and selecting the desired action to apply to all objects of that type.

9.       After you have set the “Update Action” of every object appropriately, you can migrate your changes to the database project by clicking the Write Updates toolbar button.  Click Yes at the confirmation to write the updates to the database project.

    Figure 4

10.   If you are using a source control repository, such as TFS, you will want to check in your changes.

In the next article, we will discuss how to use the Schema Compare tool to write changes to a new database environment.

.Net | SQL Server | VSTS
Tuesday, August 12, 2008 1:40:04 PM (GMT Daylight Time, UTC+01:00)
# Monday, August 11, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases.  In order to use Data Dude to manage an existing SQL Server database, the first step is to create a database project. 

There are a couple of key points you will need to know before using Data Dude.

1.       The current version of Data Dude works only with SQL Server 2000 and SQL Server 2005.  Visual Studio 2008 Service Pack 1 should provide support for SQL Server 2008.  I will describe an example using SQL Server 2005.

2.       The validation engine in Data Dude requires that you install either SQL Server or SQL Express on the same machine on which Data Dude is installed. 

3.       You must grant “Create database” rights in this database engine to the currently logged-in user.
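As a sketch, a DBA could grant that right with T-SQL along these lines.  The login name below is a placeholder; run this against the local SQL Server or SQL Express instance that Data Dude uses for validation.

```sql
-- Grant the logged-in user the right to create databases on the
-- LOCAL validation instance.  DOMAIN\DevUser is a placeholder.
USE [master];
GO
GRANT CREATE DATABASE TO [DOMAIN\DevUser];
GO

-- Alternatively, add the login to the dbcreator server role:
EXEC sp_addsrvrolemember @loginame = N'DOMAIN\DevUser', @rolename = N'dbcreator';
GO
```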

Now, let’s discuss how to create a database project to manage an existing SQL Server 2005 database.

1.       Open Visual Studio. 

2.       Select File | New Project… The New Project dialog displays

3.       Under Project Type, select Database Projects\Microsoft SQL Server

4.       Under Templates, select SQL Server 2005 Wizard.

5.       Enter a meaningful name and location for this project. 
Typically, my databases have names like “AdventureWorks_Dev” and “AdventureWorks_QA” which describe both the data and the environment to which the data belongs.  Because a single database project is used for all environments, I name my database project to describe the data and follow it with “DB” to make it obvious it is a database project.  In the above example, I would name my database project “AdventureWorksDb”.  In this exercise, I’ll create a project named “TestDB”.
    Figure 1: New Project dialog

6.       The New Database Project Wizard displays with the Welcome screen active.

7.       At the Welcome screen, click the Next button to advance to the Project Properties screen.
    Figure 2: Project Properties screen

8.       I almost never change the options on the Project Properties screen.   If my database contains any stored procedures or functions written in C# or VB.Net, I will check the Enable SQLCLR checkbox. 

9.       Click the Next button to advance to the Set Database Options screen.
    Figure 3: Set Database Options screen

10.   The options on the Set Database Options screen correspond to the settings you will find in SQL Server Management Studio when you right-click a database and select Properties.  The defaults in the database project wizard are also the defaults in SQL Server.  Since I seldom override these defaults in SQL Server, there is usually no reason to change them on this screen.

11.   Click the Next button to advance to the Import Database Schema screen.
    Figure 4: Import Database Schema screen

12.   On the Import Database Schema screen, check the Import Existing Schema checkbox.  This enables the Source database connection dropdown.  If you have already created a connection to your database, select it from the dropdown.   If you have not yet created a connection, click the New Connection button to create one now.  The process for creating a database connection in Visual Studio hasn’t changed for several versions of the product so I won’t repeat it here.  However, it is worth noting that, although Data Dude requires that you have a local installation of SQL Server, the database you connect to here can be located on any server to which you have access.  I usually connect to the Development database because this is the first database I create for an application.

13.   Click the Next button to advance to the Configure Build and Deploy screen.
    Figure 5: Configure Build and Deploy screen

14.   The Configure Build and Deploy screen contains settings that will take effect when you “deploy” your database project.  Deploying a database project writes changes to the schema of a target database (specified in the Target database name field of this screen) and is accomplished by selecting the menu options Build | Deploy with the database project open and selected.  Deploying is most useful when each developer has his own copy of the development database and needs a quick way to synchronize his schema with a master copy.  

15.   Click the Finish button to create the database project initialized with schema objects found in the source database and with the settings you chose in the wizard screens.

16. After Visual Studio finishes creating the database project, view the objects in the Solution Explorer (select View | Solution Explorer).  You should see a "Schema Objects" folder in the project containing a subfolder for each type of database object.  Open the "Tables" subfolder and you will see files containing scripts for each table in your database.  Double-click one of these script files to see the SQL code generated for you.
    Figure 6: Database project in Solution Explorer

17. If you use a source control repository, such as TFS, you will want to check this project into the repository to make it easier to share with others.

As you can see, when you use the wizard to create your project, most of the work is done for you.  You are able to change the default settings, but in most cases this is not necessary.  Often, the only change I make on the wizard screens is when I select a database connection.

In the next article, we will discuss how to use the Schema Compare tool to bring data changes into or out of your database project.

.Net | SQL Server | VSTS
Monday, August 11, 2008 3:12:22 PM (GMT Daylight Time, UTC+01:00)
# Sunday, August 10, 2008

Visual Studio Team System 2008 Database Edition is a mouthful to say, so a lot of people affectionately call it “Data Dude”.

Data Dude provides a set of tools integrated into Visual Studio that assist developers in managing and deploying SQL Server database objects.

There are four tools in this product that I have found particularly useful: the Database Project; the Schema Compare tool; the Data Compare tool; and Database Unit Tests.

A Database Project is a Visual Studio project just as a class library or ASP.Net web project is.  However, instead of holding .Net source code, a Database Project holds the source code for database objects, such as tables, views and stored procedures.  This code is typically written in SQL Data Definition Language (DDL).  Storing this code in a Database Project makes it easier to check it into a source code repository such as Team Foundation Server (TFS); and simplifies the process of migrating database objects to other environments.

The Schema Compare tool is most useful when comparing a database with a Visual Studio Database Project.  Developers can use this tool after adding, modifying or deleting objects from a database in order to propagate those changes to a Database Project.  Later, a Database Administrator (DBA) can compare the Database Project to a different database to see what objects have been added, dropped or modified since the last compare.  The DBA can then deploy those changes to the other database.  This is useful for migrating data objects from one environment to another, for example when moving code changes from a Development database to a QA or Production database.

The Data Compare tool also helps with migrating from one database environment to the next.  It facilitates the migration of records in a given table from one database to another.  The table must have the same structure in both the source and destination database.  I use this when I want to seed values into lookup tables, such as a list of states or a list of valid customer types that are stored in database tables.
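The script that Data Compare generates for a lookup table amounts to a series of inserts.  As an illustration only (the table and its values are hypothetical), seeding a customer-type lookup table would look something like this:

```sql
-- Illustrative only: the kind of script Data Compare generates when
-- seeding a lookup table.  Table name and values are hypothetical.
INSERT INTO [dbo].[CustomerType] ([CustomerTypeId], [Description])
VALUES (1, N'Retail');

INSERT INTO [dbo].[CustomerType] ([CustomerTypeId], [Description])
VALUES (2, N'Wholesale');
```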

Unit tests have increased in popularity the last few years as developers have come to realize their importance in maintaining robust, error-free code.  But unit testing stored procedures is still relatively rare, even though code in stored procedures is no less important than code in .Net assemblies.  Data Dude provides the ability to write unit tests for stored procedures using the same testing framework (MSTest) you use for unit tests of .Net code.  The tests work the same as your other unit tests – you write code and assert what you expect to be true.  Each test passes only if all its assertions are true at runtime.  The only difference is that your code is written in T-SQL, instead of C# or Visual Basic.Net. 
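As a sketch, the T-SQL body of such a test might exercise a stored procedure and fail the test when an assertion does not hold.  The procedure name below is hypothetical; Data Dude also lets you attach designer-based test conditions to the result sets a test returns.

```sql
-- Sketch of the T-SQL body of a database unit test.
-- dbo.usp_GetCustomerCount is a hypothetical stored procedure
-- that returns a count via its return value.
DECLARE @CustomerCount int;
EXEC @CustomerCount = [dbo].[usp_GetCustomerCount];

-- Fail the test explicitly if the assertion does not hold.
IF (@CustomerCount < 0)
    RAISERROR (N'Customer count should never be negative.', 16, 1);
```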

There are some limitations.  In order to use Data Dude, you must have either SQL Server 2005 or SQL Express installed locally on your development machine and you (the logged-in user) must have "Create Database" rights on that local installation.  To my knowledge, Data Dude only works with SQL Server 2000 and 2005 databases.  Plans to integrate with SQL Server 2008 have been announced but I don't know Microsoft's plans for other database engines.   I also occasionally find myself wishing Data Dude could accomplish its tasks more easily or in a more automated fashion.  I wish, for example, I could specify that I always want to ignore database users in a database and always want to migrate everything else when using the Schema Compare tool.  But overall, the tools in this product have increased my productivity significantly.  Nearly every application I write has a database element to it and anything that can help me with database development, management and deployment improves the quality of my applications.

.Net | SQL Server | VSTS
Sunday, August 10, 2008 1:34:41 PM (GMT Daylight Time, UTC+01:00)
# Saturday, August 9, 2008

When applications service a large number of simultaneous users, the developer needs to take this into account and find ways to ease the application’s bottlenecks. 

One way to help speed up a stressed application is to load into memory resources that will be requested by multiple users.  Reading from memory is much faster than reading from a hard drive or a database, so this can significantly speed up an application. 

However, each computer contains a finite amount of memory, so there is a limit to how much data you can store there.

Microsoft Distributed Cache (code named "Velocity") attempts to address this problem.  It allows your code to store data in an in-memory cache and it allows that cache to be stored on multiple servers, thus increasing the amount of memory available for storage. 

Velocity even ships with a provider that allows you to store a web site's session state, making it possible to increase the amount of memory available to your session data.
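Wiring up the session state provider is a matter of web.config settings.  The sketch below shows the general shape only; the exact provider type and attribute names have varied between Velocity CTPs, so treat every name here as a placeholder and check the documentation that ships with the CTP you install.

```xml
<!-- Sketch only: provider type and attribute names are placeholders
     and have varied between Velocity CTPs. -->
<sessionState mode="Custom" customProvider="Velocity">
  <providers>
    <add name="Velocity"
         type="Microsoft.Data.Caching.DataCacheSessionStoreProvider"
         cacheName="session" />
  </providers>
</sessionState>
```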

Microsoft has not yet published a release date for Velocity, but it is available as a Community Technology Preview (CTP).  You can download these bits and read more about it at http://code.msdn.microsoft.com/velocity.

The current CTP is not production ready - I had trouble keeping the service running on my Vista machine - but the technology shows enough promise that it is worth checking out.  When the glitches are fixed, this will make .Net an even more appealing choice for developing enterprise applications.

Saturday, August 9, 2008 2:38:56 PM (GMT Daylight Time, UTC+01:00)
# Friday, August 8, 2008

I am a recent convert to Agile methodologies. 

Until last year, I worked for a large consulting company that had established a solid reputation using a waterfall approach to deliver solutions.

My current employer is committed to the agile methodology SCRUM.  They have developed their own variation of SCRUM and several consultants here have even made a name for themselves delivering presentations on this methodology to customers and at conferences.

So it's only natural that I have been engrossed in SCRUM since joining.  Nine months of agile software development have sold me on its benefits. 

The biggest advantage I see to SCRUM is the short delivery schedule pushed by the sprints.  For those who don’t know, a sprint is a set of features scheduled for delivery in a short period of time (typically 1-4 weeks).  A sprint forces (or at least encourages) frequent delivery of working software and provides a great feedback loop to the development team. 

When users can actually see, touch and use functioning software, they don't just get value more quickly - they are able to evaluate it more quickly and provide valuable feedback.  That feedback might be a rethinking of original assumptions; it might be new ideas sparked by using the software; it might be a reshuffling of priorities, or it might be a clarification of some miscommunication between the users and the developers.  It will probably be several of these things. 
That miscommunication issue is one that occurs far too often on software projects.  Catching these misunderstandings early in the life of an application can save a huge amount of time and money.  We all know that the cost of making a change to software goes up exponentially the later that change is made.

By delivering something useable to the customer several times a month, we are providing value to the customer in a timely manner.  At best, this value comes in the form of software that enhances their ability to perform their job.  At worst, we provide something they didn't ask for.  But this worst-case scenario also adds value because we can use the delivery to clarify the misunderstandings and poor assumptions that leaked through the design.

I think back to the last waterfall project in which I was involved.  Our team was charged with designing and building an integration layer between an e-commerce web application (that was being designed at the same time) and dozens of backend systems (that were in a state of constant flux).  We spent months designing this integration layer.  During these months, the systems with which we planned to integrate changed dozens of times.  These changes included adding or removing fields; placing a web service in front of an existing interface; and completely redesigning and rewriting backend systems.

Each of these changes forced us to re-examine all the design work we had done and to modify all our documents to match the changed requirements.  In some cases, we had to start our design over from scratch.

An agile approach would have helped immensely.  Instead of designing everything completely before we started building anything, we could have minimized changes by designing, building, and deploying the integration service to one back-end system at a time.  By selecting a single integration point, we might have been able to deliver a single piece of functionality quickly while the other backend systems stabilized. 


I'm not going to suggest that agile is the appropriate methodology for every software project or that no other methodologies have value.  My former employer delivered countless successful projects using waterfall techniques. 

But it pays to recognize when agile will help your project and it is definitely a useful tool for any developer, architect or project manager to have in his or her toolbox.

Friday, August 8, 2008 3:58:27 PM (GMT Daylight Time, UTC+01:00)
# Thursday, August 7, 2008

Microsoft recently released the Managed Extensibility Framework (MEF), which allows developers to add hooks into their applications so that the applications can be extended at runtime.

Using MEF is a two-step process.  The first step is performed by the application developer, who adds attributes or code at defined points in the application.  At these points, the application searches for extensible objects and adds or calls them at runtime. 
The second step is performed by third-party developers, who use the MEF application programming interface (API) to mark classes in an “extension” assembly as extensible so that they will be discoverable by the above-mentioned applications.

The two steps are loosely-coupled, meaning neither the application nor the extension assembly needs to know anything about the other.  We don't even need to set a reference from one project to another in order to call across these boundaries.

I can think of two scenarios where this technology would be useful.

In scenario 1, an independent software vendor develops and sells a package with many pluggable modules.  Customers may choose to buy and install one module or all modules in the package.  For example, an Accounting package may offer General Ledger, Accounts Payable, Accounts Receivable, Payroll and Reporting modules, but not all users will want to pay for every module.  By using MEF, the software could search a well-known directory for any module assemblies (flagged as extensions by MEF) and add to the menu only those that are installed.  With MEF in place, more modules could be added at a later time with no recompiling.

In scenario 2, developers create and deploy an application with a given set of functionality and create points at which other developers are allowed to extend the application using MEF.  By publishing these extendable points, they can allow developers to add functionality to the application without modifying or overriding the original source code.  This is a much safer way of extending functionality.  Extensions could be anything from new business rules or workflow to additional UI elements on forms.

All the extensions happen at runtime and MEF gives developers the ability to add metadata to better describe their extension classes.  By querying this metadata, we can conditionally load only those extensions that meet expected criteria.  The best part of this feature is that we can query a class's metadata without actually loading that class into memory.  This can be a huge resource savings over similar approaches, such as Reflection.
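Putting the pieces together, here is a hedged C# sketch of the pattern.  Type names shifted between the MEF CTPs and may not match the build you download, the directory path is a placeholder, and IModule / PayrollModule are invented for illustration.

```csharp
// Sketch of the MEF extensibility pattern; names are illustrative.
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// The contract the host application publishes for extenders.
public interface IModule { void Run(); }

// A third-party class marks itself discoverable and attaches metadata.
[Export(typeof(IModule))]
[ExportMetadata("Name", "Payroll")]
public class PayrollModule : IModule
{
    public void Run() { Console.WriteLine("Payroll module running."); }
}

public static class Host
{
    public static void LoadModules()
    {
        // Scan a well-known directory for extension assemblies (placeholder path).
        var catalog = new DirectoryCatalog(@"C:\MyApp\Modules");
        using (var container = new CompositionContainer(catalog))
        {
            // Metadata is available BEFORE the extension type is instantiated.
            foreach (var export in container.GetExports<IModule, IDictionary<string, object>>())
            {
                if ("Payroll".Equals(export.Metadata["Name"]))
                    export.Value.Run();   // the class is loaded only here
            }
        }
    }
}
```

The metadata check happens on the lazily-created export, which is what lets the host skip extensions without paying the cost of loading them.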

MEF is currently released as a Community Technology Preview (CTP), so the API is likely to change before its final release.  You can download the CTP and read more about it at http://code.msdn.microsoft.com/mef.   By learning it now, you can be prepared to add extensibility to your application when MEF is fully released.

Thursday, August 7, 2008 8:37:42 PM (GMT Daylight Time, UTC+01:00)
# Friday, July 4, 2008

July 3 (Contribupendence Day) is almost over.  As promised, I wrote reviews for several colleagues on LinkedIn.  Most of the people I reviewed were those I worked with directly at my last employer.  I am currently on a project that allows me to work directly with a couple of folks from my current employer.

I became motivated enough that I ended up writing 7 reviews tonight (I had only promised 5) and I invited quite a few new people to connect with me on LinkedIn and Plaxo.

I was surprised at the number of past and present colleagues that are using these networking sites.  My connections should at least double in the next few days.

Hopefully the people I recommended will be inspired to pick 5 people of their own and this thing will explode.

My plan is to write more recommendations in the coming weeks.  I have some that I've written but cannot make official until the recipient accepts my connection invitation. 

I won't copy my recommendations on this site but you are welcome to read them here.

Again thanks to Jeff for suggesting this.

Friday, July 4, 2008 3:08:05 AM (GMT Daylight Time, UTC+01:00)
# Thursday, July 3, 2008

Tomorrow - July 3 - is Contribupendence Day.

What is Contribupendence Day, you ask?  Well it's the day when all readers of my blog agree to send me $20.

Not really.  Contribupendence Day is the brainchild of Microsoft Developer Evangelist Jeff Blankenburg, who woke up one day and noticed the untapped potential of networking sites such as LinkedIn, Plaxo and Facebook.  These sites give us the opportunity to recommend or comment on those with whom we have worked.  Unfortunately, few of us take advantage of this feature, which is a shame.  These recommendations could be a good source of feedback to potential employers and might make the difference in getting an interview or landing a job.

He's right of course. I've been on LinkedIn for a couple months and I am connected to a few dozen people but these are very passive connections.  In the back of my mind, I tell myself I'll focus on LinkedIn the next time I look for a job (which hopefully won't be for a long time).  The problem with this attitude is that I'm relying on everyone else to motivate themselves around my schedule.  When I'm ready to look for a job, will others have the time to write a glowing review for me?  If I worked for or with them, will they even remember my specific accomplishments?

So, at Jeff's urging, I'm being proactive.  Tomorrow I resolve to write reviews of 5 people on LinkedIn.  In doing so, I hope to inspire these 5 to either review me or to review someone else, which will set in motion a process that may very well come back to me.

I've worked with some great people in my life so it wasn't difficult to pick five that I can rave about.  I'll be reviewing them tomorrow and I urge you to do the same.

Happy Contribupendence Day everybody!  And you are welcome to review me.  Or, if that's too much trouble, just send me the 20 bucks.

Thursday, July 3, 2008 3:24:25 AM (GMT Daylight Time, UTC+01:00)
# Tuesday, July 1, 2008

The Ann Arbor Give Camp is July 11-13 at Washtenaw Community College - less than two weeks away.  For those who haven't heard, this is a great opportunity to contribute to some worthy causes, flex your tech muscles and network with the developer community. 

At this event, software developers, DBAs, project managers and UI designers will get together and develop projects for local charities.  So many charities requested projects for this camp that most had to be turned away due to lack of resources.  The more people involved, the more charities we can help.  The facilities will be available and staffed round-the-clock that weekend and refreshments will be provided.  For security reasons, you must register in advance in order to participate.  Jennifer Marsman of Microsoft is organizing the camp. 

I will be out of town with my son the weekend of the event, but I volunteered to help with some of the project evaluations because I really wanted to contribute.

You can get more information and you can register for this great outing at http://www.annarborgivecamp.org/.  If you will be in town, please take a look and consider giving your time.

Tuesday, July 1, 2008 4:49:47 PM (GMT Daylight Time, UTC+01:00)