MVC Gators and VS2010

I start with an opening statement like this on a View:

</head>

<body>

    <% using (Html.BeginForm()) { %>

</body>

</html>

 

And then I add a closing gator:

<% } %>

 

As soon as I hit enter to the closing gator, VS2010 reformats the code to this:

<body>

    <% using (Html.BeginForm())

       { %>

    <% } %>

</body>

 

Note how the opening gator gets moved down a line.

I think this is better b/c it is more in line with our server-side best practices – though it is against most JavaScript best practices (and the examples in the book).  I assume there is a setting in VS2010 to not automatically put in the line break…

Also, note to self:

If you write your gator like this:

<%= Html.Label(TempData["message"].ToString()); %>

 

You will get an error message like this:

Remove the semicolon at the end and things work.  The “=” in the Gator means you don’t use a semicolon at the end…
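For reference, the working gator looks like this (my understanding is that the "=" form compiles down to a Response.Write call, which supplies its own statement terminator):

```aspx
<%-- Expression only – no trailing semicolon --%>
<%= Html.Label(TempData["message"].ToString()) %>
```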

Ugh – They forgot to turn off trace

I Binged Gun Safety (my daughter shot at camp 2 weeks ago and wants to do it again).  The 4th link is from magicyellow – which I assume is the Yellow Pages.  Check out the screen shot of the page’s footer:

 
My favorite SQL was this:

SQL =

INSERT INTO logs_sc (code,
                     event,
                     client,
                     source,
                     sourceSite,
                     sourceQuery,
                     clientIP,
                     admin,
                     vid,
                     MYCatID,
                     locID,
                     userAgent,
                     aid,
                     siteID)
             VALUES ('SCL',
                     '[283868|USNC  ]',
                     '[56959191.51529074]',
                     'http://www.bing.com/search?q=Cary%2C+NC+Gun+Club&form=QBLH&qs=n&sk=',
                     'http://www.bing.com/search',
                     'q=Cary%2C+NC+Gun+Club&form=QBLH&qs=n&sk=',
                     '98.101.143.134',
                     0,
                     0,
                     283868,
                     'USNC  ',
                     'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 1.1.4322; .NET CLR 3.5.21022; .NET CLR 1.0.3705; .NET CLR 3.5.30729; MS-RTC EA 2; .NET CLR 3.0.30729; MS-RTC LM 8; .NET4.0C; .NET4.0E;',
                     '0',
                     1)

 

In case I was wondering what info they were collecting about me in my visit….

And yup – they left all of their trace information on when they went to production.  I emailed them to fix it right away – I wonder how long it will take them to read my email?

 

ASP.NET MVC2 In Action

I started working through ASP.NET MVC2 In Action.

 

A couple of nit-picks:

1) They abuse the var keyword and don’t follow best practices in terms of variable naming.  For example, on page 18:

            var model = new GuestBookEntry();

            return View(model);

 

Using var here is just lazy, and using the variable name "model" leads to confusion – it should be:

            GuestBookEntry guestBookEntry = new GuestBookEntry();

            return View(guestBookEntry);

 

Because that is what the controller is actually returning.

2) They don’t fully-qualify the model on the views.  For example, on page 18 again:

<%@ Page Title=""

    Language="C#"

    MasterPageFile="~/Views/Shared/Site.Master"

    Inherits="System.Web.Mvc.ViewPage<GuestBookEntry>" %>

 

It should be:

<%@ Page Title=""

    Language="C#"

    MasterPageFile="~/Views/Shared/Site.Master"

    Inherits="System.Web.Mvc.ViewPage<Com.Tff.GuestBook.Models.GuestBookEntry>" %>

 

I don’t know of any way to NOT use the full namespace in MVC views (they must know one, unless their book code doesn’t compile OR they are putting everything in the same namespace – Ugh!)

I DO like the fact that they put line breaks in the View definition – it is much more readable.

Aside from the nit-picks, I have 1 larger beef.

I believe that having more models (DTOs, POCOs, whatever you want to call them) is better than having 1 uber-model that you shoehorn into all situations.  In fact, having a use-case-specific model follows the single-responsibility principle very well.  I DON’T agree with the use of nested classes.  For example, on page 29 they recommend (constructor is mine – I put it in to get the View to match their screen shots):

    public class CustomerSummary

    {

public CustomerSummary(string name, string serviceLevel, string orderCount, string mostRecentOrderDate, bool active)

        {

            this.Name = name;

            this.ServiceLevel = serviceLevel;

            this.OrderCount = orderCount;

            this.MostRecentOrderDate = mostRecentOrderDate;

            this.Input = new CustomerSummaryInput();

            this.Input.Active = active;

        }

 

        public string Name { get; set; }

        public string Active { get; set; }

        public string ServiceLevel { get; set; }

        public string OrderCount { get; set; }

        public string MostRecentOrderDate { get; set; }

        public CustomerSummaryInput Input { get; set; }

 

    }

 

    public class CustomerSummaryInput

    {

        public int Number { get; set; }

        public bool Active { get; set; }

    }

 

Yuck!

There should be a CustomerSummary class and a CustomerInput class.  If you need to link the two – that is what primary keys are for (heck, they start that with the Number property in CustomerSummaryInput).  This nested class combines 2 different things (displaying the customer and getting new customer info).  If there needs to be a customer base class (FirstName, LastName, Active) and then derived classes with the ServiceLevel, etc., that is much cleaner.
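A rough sketch of the split I have in mind – property names are lifted from their example, and the shared Number key is my assumption about how the two classes would link up:

```csharp
// Display-only model: everything the summary grid needs, nothing else.
public class CustomerSummary
{
    public int Number { get; set; }               // primary key shared with the input model
    public string Name { get; set; }
    public string ServiceLevel { get; set; }
    public string OrderCount { get; set; }
    public string MostRecentOrderDate { get; set; }
}

// Input model: only the fields the form posts back.
public class CustomerInput
{
    public int Number { get; set; }               // same key, no nested class required
    public bool Active { get; set; }
}
```

Each view then binds to exactly one of these, which keeps the single-responsibility principle intact.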

 

 

Magic Numbers: Lessons Learned

Magic numbers will kill you.  I set up a stored procedure many years ago that included the following TSQL syntax:

HAVING (((tblMeet.SeasonID)=5) AND ((tblMeet.UseForPR)=1) AND ((tblMeet.IsCurrent)=0))

I remembered to change the magic number for a couple of years, but I didn’t change the seasonId at the beginning of this season (now on season 6) so erroneous results were generated (and some kids almost didn’t get a PR ribbon, which would suck).  I changed the syntax to this:

Declare @seasonId int

Set @seasonId = (

      Select seasonId from tblSeason

      where SeasonDesc = Convert(varchar,YEAR(Getdate())))

and later on:

HAVING (((tblMeet.SeasonID)=@seasonId) AND ((tblMeet.UseForPR)=1) AND ((tblMeet.IsCurrent)=0))

And now I don’t have to remember this change next year.  Dear Jamie2011, you are welcome….

BTW: how cool is the intellisense in SQLServer 2008 Management Studio? Awesome Microsoft. Awesome.

Moving SQL Server between 2 Hosting Companies

I am moving a site from Host4Life to WinHost.  There were 2 high-level tasks:

Publish the website

Move the database

 

Publishing the website was easy.   All I had to do was change the FTP address in the Visual Studio 2010 Publish Website wizard.

Moving the database was more complicated.  I tried 3 different ways using SQL Server Management Studio 2008 (SSMS):

1)      Backup/restore.  The backup failed on the default file location.  However, when I backed it up without specifying the path, the backup worked.  Somewhere on Host4Life’s data servers is a file named Test.  In any event, the fact that I could not back up to my local file system in SSMS killed this idea.  I then tried to look for Test, but they have a new interface (tinyhost), my SQL Server password did not work, and there was no way I was interacting with those clowns at H4L’s help desk.

2)      Copy Database.  I tried this next.  It failed when trying to copy from H4L to WinHost and from H4L to my local file system.  I went through a couple of iterations with my local system – I made more progress once I enabled all of the SQL Server services, but it ultimately failed, and the log did not easily tell me why.

3)      Scripting.  Ultimately, this is what worked for me.  I don’t know in which version the option to script the data was added (shown here), but that made all the difference.  I scripted the tables, functions, views, and stored procs (fortunately, I don’t have any recursive dependencies in my objects) in order, and the database came up as expected.

POCOs

I started working through the POCO Template for Entity Framework found here.  I got through the exercise and have a pretty good idea of the what (if not the why) of POCOs.  I ran into a couple of gotchas:

 

If you misspell the path in

string inputFile = @"..\POCOTTemplateWalkthrough\Blogging.edmx";

you get this error:

Error             1                     Running transformation: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. —> System.IO.FileNotFoundException: Unable to locate file

   at Microsoft.VisualStudio.TextTemplating.VSHost.TextTemplatingService.ResolvePath(String path)

   at Microsoft.VisualStudio.TextTemplating.VSHost.TextTemplatingService.ResolvePath(String path)

 

Once I got to step #9, I ran it and it worked.  I then created a new Console app to be my User Tier, changed the project type of  POCOTemplateWalkthrough to a Class Library, and moved the Program.cs file to the UI.

To my surprise, it did not compile:

I then had to add a reference to Entities (I thought my UI could be Entity-unaware?) to get it to compile and run.  It compiled, but when I ran it, I got the following error:

{"Violation of PRIMARY KEY constraint ‘PK_People’. Cannot insert duplicate key in object ‘dbo.People’.\r\nThe statement has been terminated."}

I then deleted the 1st record in the data table to get it to work.

 

I ran it again and it worked:

Since I did not specify the ID, it is defaulting to 0 (the tutorial needs to be fixed some).  I then looked at the database table and confirmed that there was no identity key:

I changed it to identity and updated the Entity Framework and POW!
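For reference, the fix amounts to making the key column an identity.  A sketch of the equivalent DDL (the constraint name comes from the error above; the Name column is my guess at the rest of the tutorial’s table):

```sql
-- SQL Server now assigns 1, 2, 3, ... on insert, so the POCO
-- no longer needs to supply an ID.
CREATE TABLE dbo.People
(
    ID   int IDENTITY(1,1) NOT NULL
         CONSTRAINT PK_People PRIMARY KEY,
    Name nvarchar(50) NOT NULL
);
```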

Of course, another way to do it is to add this line of code:

person.ID = 2;

 

But who wants to keep track of Primary Keys on the Data Layer? That is what identity keys are for.  I am a fan of surrogate keys, not a fan of composite keys. 

Mythical Man Month and Software Project Survival Guide

I finished two books over the last week.  The first is The Mythical Man-Month.  An oldie-but-goodie, it was surprising how many of the problems written about over the last 30 years are still with us.  Of course, they are human problems, and it is much easier to change a computer than a person.  Brooks’s hypothesis of Time Versus Number of Workers in a task with complex interrelationships is still spot on – and companies are still re-learning this lesson even today.  The interesting thing is that his solutions are not really relevant – most have been tried, modified, or discarded (the specialization of labor and the parallel documentation have been largely discredited), but the problems of conceptual integrity and communicating outside of the system are still with us.  I chuckled when he wrote “this technical fad of object oriented programming…”.

The other book I read was Software Project Survival Guide by Steve McConnell.  A PMP light read, this book is showing its age.  I enjoyed his discussion on how to treat developers and how to get them to avoid context switching (something my current employer still hasn’t figured out) and the stages in planning.  If I have to lead a project in an overly bureaucratic organization that is behind the curve in terms of industry best practices, I would certainly reach for this book.  If I have the ability to lead a project in an agile and flexible organization, this book would stay on the shelf.

Mocking and .NET Framework Client Profile

I found an interesting small gotcha with Visual Studio today.

I whipped up a quick Hello World Console application and added references for Rhino Mocks and Moq.  However, neither would compile:

I then realized that the Console application, by default, uses the .NET Framework Client Profile.  You can read more about it here – basically, it is a stripped-down version of the .NET Framework to allow for a smaller footprint on WinForms, Services, and Console applications.  You can see it in the project properties:

 

Apparently, both mocking frameworks use System.Web, which is not part of the Client Profile.  Simply switching the target framework to .NET Framework 4.0 solved the issue.
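If you would rather edit the project file than use the Properties page, the profile is a single element in the .csproj – a sketch (the rest of your PropertyGroup will differ); deleting the element targets the full framework:

```xml
<PropertyGroup>
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
  <!-- Remove this element to target the full .NET Framework 4.0
       instead of the Client Profile. -->
  <TargetFrameworkProfile>Client</TargetFrameworkProfile>
</PropertyGroup>
```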

 

Death March

 

I read Death March over the weekend.  There were some interesting points about  Death March projects:

·         There are a number of good, in fact necessary, reasons to work on a Death March project.

·         The software industry employs many young people who have both the time and energy to work on a Death March project.

·         The absolute necessity of getting away from the overall bureaucracy when working on a Death March project – esp. the methodology police, the physical plant police, etc.

·         The importance of negotiation in the project – esp. with the stakeholders.

·         The law of diminishing returns – after a certain point, the overtime that is put in works to the detriment of the code base and the overall project.

·         The team is critical – you need people who can and like to work together

·         I liked his suggestions to separate the team: “skunk works”, telecommute, and graveyard shift are all good ways to give the team a physical separation they need from the normal bureaucracy.

·         The most important point of the book: Triage.  All external constraints need to be filtered before they get to the project (including requirements!)  I liked his 80/20 rule – 20% of the “required” functionality will not be delivered, so triage to filter it out early.  The project will still be considered a success.

·         One more interesting point: the author seems to embrace XP (and many of the processes in Agile like daily/continuous builds, etc…) to solve Death March problems.  After reading this book, I would agree with him.

PLINQ – Does it save any time?

Well, yeah!

 

I created a quick Console application  and set up a linear  function call like so:

 

            Stopwatch stopwatch = new Stopwatch();

            stopwatch.Start();

            GetNorthwindCustomers();

            GetAdventureWorkWorkOrders();

            stopwatch.Stop();

            Console.WriteLine(string.Format("Elapsed Time: {0} milliseconds", stopwatch.ElapsedMilliseconds.ToString()));

 

And set up 2 data objects – I used Entity Framework for Northwind and LINQ to SQL for AdventureWorks:

        static void GetNorthwindCustomers()

        {

            ConsoleColor currentColor = Console.ForegroundColor;

            Console.ForegroundColor = ConsoleColor.Red;

            NorthwindEntities dataContext = new NorthwindEntities();

 

            var customers = (from c in dataContext.Customers

                             select c).OrderBy(c => c.CompanyName);

 

            for (int i = 0; i < 50; i++)

            {

                foreach (Customer customer in customers)

                {

                    Console.WriteLine(string.Format("CustomerID: {0} – CompanyName: {1}", customer.CustomerID, customer.CompanyName));

 

                }

            }

            Console.ForegroundColor = currentColor;

        }

 

And

        static void GetAdventureWorkWorkOrders()

        {

            ConsoleColor currentColor = Console.ForegroundColor;

            Console.ForegroundColor = ConsoleColor.Blue;

            AdventureWorksDataContext dataContext = new AdventureWorksDataContext();

 

            var workOrders = (from wo in dataContext.WorkOrders

                            select wo).OrderBy(wo => wo.StartDate);

 

            foreach (WorkOrder workOrder in workOrders)

            {

                Console.WriteLine(string.Format("WorkOrderId: {0} – StartDate: {1}", workOrder.WorkOrderID, workOrder.StartDate.ToString()));

            }

            Console.ForegroundColor = currentColor;

        }

 

After running it, I get the following result:

 

I then added the AsParallel to the data context like this:

var customers = (from c in dataContext.Customers.AsParallel()
                 select c).OrderBy(c => c.CompanyName);

var workOrders = (from wo in dataContext.WorkOrders.AsParallel()
                  select wo).OrderBy(wo => wo.StartDate);

and ran it

The results were:

Two questions came to mind:

·         Why was there a performance gain?

·         The database calls still ran in sequence.  How can I get them to run simultaneously?

I decided to tackle the second question first – leaving the first as an academic exercise to be completed later.

I realized immediately that the database calls were running in sequence because, well, they were being called in sequence:

            GetNorthwindCustomers();

            GetAdventureWorkWorkOrders();

 

To get the database calls to run in parallel, I needed to use Parallelism at that level of the call stack.

I first tried to add both function calls to a Task:

            stopwatch.Start();

            Task taskNorthwind = Task.Factory.StartNew(() => GetNorthwindCustomers());

            Task taskAdventureWorks = Task.Factory.StartNew(() => GetAdventureWorkWorkOrders());

            stopwatch.Stop();

 

The funny thing is – the stopwatch stopped immediately and the 2 tasks ran after that.  So I needed a way to run only the database calls in parallel, not the stopwatch.
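In hindsight, Task.Factory.StartNew returns as soon as the tasks are queued, so the stopwatch stops before either database call finishes.  One way to keep the Task approach (a sketch – not what I ended up using) is to wait on both tasks before stopping the watch:

```csharp
stopwatch.Start();
Task taskNorthwind = Task.Factory.StartNew(() => GetNorthwindCustomers());
Task taskAdventureWorks = Task.Factory.StartNew(() => GetAdventureWorkWorkOrders());
// Block until both database calls complete before reading the elapsed time.
Task.WaitAll(taskNorthwind, taskAdventureWorks);
stopwatch.Stop();
```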

I then tried the Parallel.Invoke method like so:

            Parallel.Invoke(GetNorthwindCustomers);

            Parallel.Invoke(GetAdventureWorkWorkOrders);

 

And I got the same result as a linear call:

So then I realized I should put both tasks in the same Parallel call:

Parallel.Invoke(GetNorthwindCustomers, GetAdventureWorkWorkOrders);

 

Now the different commands in each function are running in parallel – like the console color.  However, the database calls are still operating as a single block.  I then decided to test this hypothesis by putting a Thread.Sleep in the first database fetch.
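The change was tiny – something like this inside GetNorthwindCustomers, right before the fetch loop (the 2-second pause is an arbitrary value I picked):

```csharp
// Stall the Northwind side long enough for the AdventureWorks
// query to start issuing its own database calls.
Thread.Sleep(2000);
```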

What do you know, the database calls do interleave – if given enough time:

I did notice that no matter what I did with the tasks, the actual sequence from the LINQ was the same (because of the OrderBy extension method).  That is good news for programmers who want to add parallelism to their existing application and rely on the order from the database (perhaps for the presentation layer).

All in all, it was a very interesting exercise on a Saturday AM.  I ordered

 

I wonder if there are new patterns that I will uncover after reading this.