MVC

by MikeHogg, 25 June 2012 10:06

 

In my second or third MVC3 project, I have really enjoyed model binding, and my familiarity with the MVVM pattern in WPF translated well. Web projects are much more enjoyable for me in C# than they were in the old WebForms days. I also found features in C# 4.0 that I really like, and I frequently find myself taking advantage of nullable parameters and lambda expressions. I was asked to replicate some legacy code in MVC3: it retrieved a feed of XML from a web service and turned it into a searchable data store of links. Caching was a new feature this time around. The number of links was small enough that I decided to use a file, not a database, for the store, and building my data model class from simple .NET types allowed me to save my models in Properties.Settings. In this code sample you will see the model, including an IEnumerable of Headlines tagged with an enum value, since we use this to serve up two different pages: InTheNews for public third-party publications (WBAL, Women's World) and NewsReleases (company-generated PR)…

 

 

 

using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Web;

namespace SomeBase.Models
{
    public enum NewsType { InTheNews, NewsReleases };

    public class NewsModel
    {
        // Title derives from NewsType; the setter is a deliberate no-op
        public string Title { get { return (NewsType == NewsType.InTheNews) ? "In The News" : "News Releases"; } set { var x = value; } }
        public NewsType NewsType { get; set; }
        public DateTime LastUpdateDate { get; set; }
        public IEnumerable<Headline> Headlines { get; set; }
        public string SearchString { get; set; }
        public string SearchType { get; set; }
        public List<System.Web.Mvc.SelectListItem> SearchTypes { get; set; }
        [DisplayFormat(DataFormatString = "{0:" + MH.Web.Mvc3.lib.CONST.DATEFORMATSTRING_SHORT + "}", ApplyFormatInEditMode = true)]
        public DateTime? SearchStartDate { get; set; }
        [DisplayFormat(DataFormatString = "{0:" + MH.Web.Mvc3.lib.CONST.DATEFORMATSTRING_SHORT + "}", ApplyFormatInEditMode = true)]
        public DateTime? SearchEndDate { get; set; }
    }

    public class Headline
    {
        [DisplayFormat(DataFormatString = "{0:" + MH.Web.Mvc3.lib.CONST.DATEFORMATSTRING_SHORT + "}")]
        public DateTime PostDate { get; set; }
        public HtmlString Link { get; set; }
        public NewsType NewsType { get; set; }
        public string ArticleType { get; set; }
    }
}

In the controller I use an abstract class, because this might be served from the desktop web site or, in another project, from the mobile site. Most of the functionality is in the abstract controller, but the concrete controller must implement the Save and Load functions for whatever data store it uses, in our case Properties.Settings. Here I also get to use the new Xml.Linq libraries, which make loading an XML document from a URL and parsing it one-liners…
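The abstract/concrete split itself isn't shown in the original listing, so here is a minimal sketch of the shape I mean. The base class name, the Properties.Settings string setting (NewsXml), and the SerializeNews/DeserializeNews helpers are all assumptions for illustration, not the real code:

using System;
using System.Web.Mvc;

public abstract class NewsControllerBase : Controller
{
    // shared actions live here; GetFilteredNews/GetAllNews/GetNewsFeed (shown below) do too
    public ActionResult News(Models.NewsType newstype)
    {
        return View("News", GetFilteredNews(new Models.NewsModel { NewsType = newstype }));
    }

    // the only pieces each site must supply: how the model is persisted
    protected abstract Models.NewsModel GetSavedNews();
    protected abstract void SaveNews(Models.NewsModel news);
}

public class AboutController : NewsControllerBase
{
    // desktop site: keep the cached feed in Properties.Settings as a string
    protected override Models.NewsModel GetSavedNews()
    {
        string xml = Properties.Settings.Default.NewsXml;          // hypothetical setting name
        return String.IsNullOrEmpty(xml)
            ? new Models.NewsModel { LastUpdateDate = DateTime.MinValue, Headlines = new Models.Headline[0] }
            : DeserializeNews(xml);                                 // hypothetical helper
    }

    protected override void SaveNews(Models.NewsModel news)
    {
        Properties.Settings.Default.NewsXml = SerializeNews(news);  // hypothetical helper
        Properties.Settings.Default.Save();
    }
}

A mobile-site project could then subclass the same base and override only those two members against its own store.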

 

 

private Models.NewsModel GetFilteredNews(Models.NewsModel news)
{
    Models.NewsModel allnews = GetAllNews(news.NewsType);
    news.SearchTypes = allnews.SearchTypes; // recreate dropdownlist - lost on postback
    news.Headlines = allnews.Headlines
        .Where(h => h.NewsType == news.NewsType &&
            (news.SearchEndDate == null || h.PostDate < news.SearchEndDate) &&
            (news.SearchStartDate == null || h.PostDate > news.SearchStartDate) &&
            (news.SearchType == null || news.SearchType == h.ArticleType) &&
            (String.IsNullOrEmpty(news.SearchString) || h.Link.ToString().ToLower().Contains(news.SearchString.ToLower())))
        .OrderByDescending(h => h.PostDate);
    return news;
}

private Models.NewsModel GetAllNews(Models.NewsType newstype)
{
    Models.NewsModel news = GetSavedNews();
    if (news.LastUpdateDate < DateTime.Now.AddDays(-1))
    {
        // get feed from vocus (refresh at most once a day)
        Models.Headline[] latest = GetNewsFeed();
        if (latest.Length > 0)  // in case vocus is down
        {
            news.Headlines = latest;
            news.LastUpdateDate = DateTime.Now;
            SaveNews(news);
        }
    }
    // process for the view
    news.NewsType = newstype;
    news.SearchTypes = news.Headlines
        .Where(h => h.NewsType == newstype && !String.IsNullOrEmpty(h.ArticleType))
        .Select(h => h.ArticleType)
        .Distinct()
        .OrderBy(t => t)
        .Select(t => new System.Web.Mvc.SelectListItem { Text = t, Value = t })
        .ToList();
    return news;
}

private Models.Headline[] GetNewsFeed()
{
    List<Models.Headline> headlines = new List<Models.Headline>();
    System.Xml.Linq.XDocument collateral = System.Xml.Linq.XDocument.Load(
        "http://somenewsfeed.com/Custom/CustomXmlFeed.aspx?something=something");
    System.Xml.Linq.XDocument news = System.Xml.Linq.XDocument.Load(
        "http://somenewsfeed.com/Custom/CustomXMLFeed.aspx?somethingelse=something");
    headlines.AddRange(news.Descendants(System.Xml.Linq.XName.Get("NewsResults")).Select(e => new Models.Headline
    {
        ArticleType = GetField(e, "News_MediaOutletSortName"),
        Link = GetLink(e, "News"),
        NewsType = Models.NewsType.InTheNews,
        PostDate = DateTime.Parse(GetField(e, "News_NewsDate"))
    }).Where(e => e.Link != null));
    headlines.AddRange(collateral.Descendants(System.Xml.Linq.XName.Get("SomeResults")).Select(e => new Models.Headline
    {
        ArticleType = GetField(e, "A_Name"),
        Link = GetLink(e, "ALink"),
        NewsType = Models.NewsType.NewsReleases,
        PostDate = DateTime.Parse(GetField(e, "PublishDate"))
    }).Where(e => e.Link != null));
    return headlines.ToArray();
}
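The GetField and GetLink helpers are not shown in the original post. Here is a minimal sketch of what they might look like, assuming each result element wraps its fields in child elements of the given name and that the link is built from a URL field and a headline field; the "_Url"/"_Headline" suffixes are assumptions, not the real feed schema:

// Hypothetical helpers - element names are illustrative only.
private string GetField(System.Xml.Linq.XElement e, string name)
{
    var field = e.Element(System.Xml.Linq.XName.Get(name));
    return field == null ? null : field.Value;
}

private HtmlString GetLink(System.Xml.Linq.XElement e, string prefix)
{
    // assumes the feed exposes e.g. News_Url and News_Headline child elements
    string url = GetField(e, prefix + "_Url");
    string text = GetField(e, prefix + "_Headline");
    if (String.IsNullOrEmpty(url) || String.IsNullOrEmpty(text)) return null;
    return new HtmlString(String.Format("<a href=\"{0}\">{1}</a>", url, text));
}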

 

 

The HTML page is simple enough, with clean model binding and the HTML helpers. Note the handy "If" MvcHtmlString extension and the use of a display template for Headlines.
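The "If" extension is a small custom helper rather than anything built into MVC; a minimal sketch of the version I have in mind, which simply swallows the markup when the condition is false, looks like this:

using System.Web.Mvc;

public static class MvcHtmlStringExtensions
{
    // Render the string only when the condition holds; otherwise emit nothing.
    public static MvcHtmlString If(this MvcHtmlString value, bool condition)
    {
        return condition ? value : MvcHtmlString.Create(string.Empty);
    }
}

The @Html.DisplayFor(m => m.Headlines) call further down then renders each Headline through a Views/Shared/DisplayTemplates/Headline.cshtml template, one item at a time, by normal MVC convention.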

 

 

<h2 style="margin:30px;">@Html.DisplayFor(m=>m.Title)</h2>
@using (Html.BeginForm())
{
        @Html.ActionLink("Request Customized News", "RequestCustomizedNews").If(Model.NewsType == SomeBase.Models.NewsType.NewsReleases)
<div style="float:left; width:200px; display:block;">
 <h3>Search Archives</h3>
 <div class="display-label">Headline</div>
 <div class="display-field">
                @Html.EditorFor(model => model.SearchString)
 </div>
 
 <div class="display-label">Type</div>
 <div class="display-field">
                @Html.DropDownListFor(m=>m.SearchType, Model.SearchTypes, "All")
 </div>
 <div class="display-label">From:</div>
 <div class="display-field">
                @Html.TextBoxFor(model => model.SearchStartDate, new { @class = "calendar", @Value=Model.SearchStartDate.HasValue? Model.SearchStartDate.Value.ToString(MH.Web.Mvc3.lib.CONST.DATEFORMATSTRING_SHORT): null })
 </div>
 <div class="display-label">To:</div>
 <div class="display-field">
                @Html.TextBoxFor(model => model.SearchEndDate, new { @class = "calendar" })
 </div>
 <input type="submit" value="Go" />
</div>
<div style="float:left;">
 <h3>Recent Articles</h3>
 <div>
                @Html.DisplayFor(m => m.Headlines)
 </div>
</div>
}
I am able to serve this one "News" page as two different URLs, InTheNews and NewsReleases, because of the enum and MVC's routing mechanism, which lets me add a custom route with a custom constraint. (I have become a DRY fan, and the two pages were nearly identical as it was, so I could hardly bear to repeat the code across two views, two controller GETs, and two controller POSTs.) Next time I touch it I would make this more dynamic by iterating through the enum instead of passing hardcoded values, but it was a neat exercise nonetheless…

 

 

 

{Global.asax}

            // Abouts
            routes.MapRoute(
                "About",                                           // Route name
                "About/{newstype}",                                // URL with parameters
                new { controller = "About", action = "News" },     // Parameter defaults
                new { newstype = new OptionalConstraint(new string[] { "InTheNews", "NewsReleases" }) },
                new string[] { "Some.Controllers" }
            );

public class OptionalConstraint : IRouteConstraint
{
    string[] _mappedactions;

    public OptionalConstraint(string[] mappedactions)
    {
        _mappedactions = mappedactions;
    }

    public bool Match(System.Web.HttpContextBase httpContext, Route route, string parameterName, RouteValueDictionary values, RouteDirection routeDirection)
    {
        if (routeDirection == RouteDirection.IncomingRequest)
        {
            return values[parameterName] == UrlParameter.Optional ||
                   _mappedactions.Contains(values[parameterName].ToString(), StringComparer.OrdinalIgnoreCase);
        }
        return false;
    }
}
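The hardcoded string array is what the post describes; the more dynamic version I mention above would just derive the allowed values from the NewsType enum. A sketch of that idea, not the code as it shipped:

            // Feed the constraint from the enum so adding a NewsType value
            // automatically adds a matching URL.
            routes.MapRoute(
                "About",
                "About/{newstype}",
                new { controller = "About", action = "News" },
                new { newstype = new OptionalConstraint(Enum.GetNames(typeof(SomeBase.Models.NewsType))) },
                new string[] { "Some.Controllers" }
            );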

Tags: MVC

No reason not to use automated deployments, with Hudson or Jenkins

by MikeHogg, 17 June 2012 19:31

 

 

This is mostly discussion, not strictly about Jenkins, but related to building and deploying team projects. Feel free to add.

Build and Deploy vs BuildDeploy

You can use a batch-file xcopy or other build steps to deploy to an environment as part of your build, or you can use the Promote plugin and put your deployment in the promote steps rather than the build steps. I use both. Every build automatically copies to a 'dev server', and the Stage deployment is an extra step. Jenkins is so configurable that your imagination is probably the only limit. I like having a dev server so I or other devs can see the latest, while leaving the Stage server alone for Account Services to review until they want to see the latest version. You can implement Prod as an additional Promote step, using the Access Control for Approvers, or you can have a separate .ps1 file or .ftp hosts file that the admin manually drops into place before running a (not really) Stage promotion.

Prod Access

You can just use the Promote Approver Access control, and/or you can have the prod FTP hosts set up in the System Config but not in any Project Configs; then the admin, being the only person who can change the Project Configs, can go in and change the FTP host in the Project Config, promote/build, and then change it back (I've actually seen enterprise teams do something like this). Or you can just use your FTP client manually and favorite the Jenkins artifacts directory, but then you still have to take care not to FTP the wrong version, which is the whole point of automating deployments in the first place.

Library Projects

I used a simple one-project web application to try out one particular build and deployment approach for library projects. I wanted to stay true to DRY and leverage a shared library, even without the benefit of automated unit testing on a build server. There are many ways to do similar things; this is just the approach I'm used to. My project had a dependency on the EuroMVC library. I left this as a separate project in SVN/Jenkins that built its dlls into an SVN directory, and then copied those dlls into my project. Development of the EuroMVC library may continue, and my web app can remain unaffected with the same dll version it was tested with. I left it up to the project leader of the web app to decide if they ever want to go back and get an updated version of the dlls. Visual Studio lets you seamlessly debug into your library code from the dlls, as long as you also have the symbols and library source (which we have in this scheme).

Database Projects

New to VS 2010, I found, is a DB project type that deploys incremental changes to different DB environments. This has made my DB deployments one click as well, although I have not yet wired them into the Jenkins command line. It has taken off my hands the management of so many .sql scripts and the maintenance of those scripts through the development process. I really like it.

Approach: ConfigurationManager, Batch scripts, and Different Web Apps for the same Sln

My approach: I set my MSBuilds to output to a publish directory, a sibling of my project's .sln file, with a separate directory for each environment/.sln configuration. Then I create a post-build step to archive artifacts, which copies the whole publish directory off to ../build/archive/BuildNumber/ directories. This does two things. First, you can retrieve these versions even as the project moves on and on; you can always go back to the version that was built/on prod last month and revert/FTP that. Second, Jenkins automatically keeps track of these for Promote steps, so you don't even have to revert manually; you can just re-promote whichever build you like in the list, anytime.

Between the MSBuild arguments and batch-file build steps you should be able to nail down an artifact publish tailored to each environment you use. They can be time-consuming to script at first, but once you nail it down you don't worry about it ever again. I've already been using the Configuration Manager settings for .sln files to get distinct artifact directories. I wonder how hard it would be to set up different configurations for a .sln file to output a Windows service and two distinct web apps for Jenkins to promote; it's probably much the same. And if you can't remove directories from the publish through the file/folder Properties in VS, a simple batch script step will remove them from the artifact directories.

 

Jenkins can be found here:

http://jenkins-ci.org/ Their wiki and documentation are some of the best I've seen, but I will try to cover the parts relevant to us here.

History/Jenkins vs Hudson

http://en.wikipedia.org/wiki/Jenkins_(software) The short version: Kawaguchi worked for Sun. He authored Hudson as an open source Java program and won awards. Sun was bought by Oracle. Oracle had issues with the direction of the project. Kawaguchi and the devs renamed the project Jenkins. Oracle continued work on the original trunk, so now we have two branches: the original devs on Jenkins, and Oracle on Hudson. Both projects are still active. Hudson just released a new major version in fall 2012, and Jenkins had three minor releases in the first week of 2012 alone.

Installing

I installed using the regular stable Windows installer from the first link above. I changed Path To Install to something off the root like c:\Jenkins, because we are going to refer to the Jenkins Home path a lot.

 

If you use the installer, nothing else is required: the setup.exe handles installing .NET 2.0 if necessary (to run as a service), and the MSI bundles a JVM (the only real requirement). After installation, browse to the site. It installs by default on port 8080, although you can change that if you need to in %HOME%/jenkins.xml. The first thing I did was check the Enable Security checkbox and, under Security Realm, choose the Jenkins' Own User Database option. Make sure under Authorization you have Allow Users To Do Anything selected until the next step. Save that, then Sign Up. That's it. Proceed to the Manage Users step to lock down the access.

In the teams I've been on, devs were set up to do anything except promote to prod, and one dev was the assigned admin if there wasn't a Change Control team. Here is my Authorization section in the Configure Hudson screen. Note that Hudson does not have roles, although there is a special token user 'authenticated' you can use. Note also that the Promote plugin has its own Access Control, so this is for build rights, not necessarily deployment rights; see the note at the bottom of this screen. There is also an option to choose Matrix Access like you see here, but for each project individually. This could be all you need: an admin, and all authenticated users allowed to read and build. If so, then continue to configuring your first project.

Creating a new project is two steps: name it and choose Free Style (or copy from an existing project), then click Create to go to the configure screen. The important parts of the configure screen: 1. Set your Source Control to Subversion and enter your Repo URL. You should get a helpful validation error that prompts you to enter credentials, which are then saved behind the scenes. 2. Click Add Build Step (MSBuild). If you have not yet added the MSBuild plugin, add it in your system configuration. Find your path to the .sln file for this field: Jenkins will download the solution automatically from SVN into something like C:/Jenkins/jobs/JenkinsProjectName/workspace/

Command-line arguments. This is the most complicated part so far, and it depends on your strategy and requirements. For me they usually look like the example below. Note there are some Jenkins variables you can use here, like ${BUILD_TAG}, to put each build in its own separate directory; with the new Jenkins I found this unnecessary, but the option remains for more complicated scenarios. Here I am also doing a Configuration= for each of my Web.config transforms, and putting each of those into a separate directory, so my workspace structure looks like this:

 

Jenkins automatically creates all of these directories; all you decide is the HOME directory, C:\Jenkins. Jenkins creates the workspace, builds, and promotions directories. The workspace directory is where the SVN source goes and where Jenkins will build the project. The builds and promotions directories are mostly just logging (builds is also where Jenkins archives, but you don't need to know that). I want MSBuild to output to a directory that I can archive. The publish directory location and the artifacts placed in it come from my approach of using the MSBuild Configuration parameters. In Hudson I was doing this manually with batch scripts and my own directory structures, but Jenkins is more advanced and handles that automatically if we follow a couple of conventions. So I put my publish directory here under workspace, because the Artifact Copy (a later step) roots at the workspace directory. My MSBuild command line that works for web apps: /t:Rebuild /p:OutDir=..\publish\Debug\;Configuration=Debug;UseWPP_CopyWebApplication=True;PipelineDependsOnBuild=False Just below Build Steps, you can see I add a post-build step to Archive the Artifacts. This approach is discussed here.

3. Click Save. And that's it! "Where's my prod deployment?", you ask. Note the two different builds you added above: for each build you run, you get a directory of artifacts (the publish) of your project, one transformed for each build step you specify. So when you want to move to prod, just copy from publish/Release for that build number. That means you can keep committing and building, and when an older version passes user testing, you can copy that specific build version to prod. There is tons more you can do; move on to the Promote and FTP plugins for one-click deployments.

Promote Builds is a way to add a step after the build; this is how I achieved post-build deployments. Install it from the plugins page, and then this one-line checkbox for the Promote section sneaks up in the Project Configure screen.

Here you see how I set up Approvers

As you see here, I use this in conjunction with Send Build Artifacts over FTP

Download and install the FTP plugin from the Manage Plugins page. Note: there are two; you want the one called specifically "Publish over FTP". In Hudson, at the time, the FTP plugin was not great, and I settled on a combination of xcopy and a PowerShell FTP script, so I had no experience setting up this FTP plugin; but looking at the documentation, it has all the features built in that I had to script in the old version. In fact, the new plugin works great, everything I wished for six months ago. Set your hosts up in System Config:

Then set up your Promote Step in Project Config to use that host. I found these settings worked for my case:

 

 

This was the old way I set up FTP in Hudson, before the plugin existed. I leave it here as an example of the power of the Powershell plugin: I used two promote actions with my script, first an xcopy:

xcopy ..\publish\Stage\hudson-%PROMOTED_JOB_NAME%-%PROMOTED_NUMBER%\_PublishedWebsites c:\Web\DEPLOYED /ICERY
rem this is to setup the powershell script next, because powershell plugin doesn't recognize %PROMOTED_JOB_NAME% etc

Then the PowerShell script is called with parameters for Stage; the script itself follows below.

& 'C:\Users\mhogg\.hudson\jobs\CE.Ohio\workspace\Ohio\promote.ps1' "C:\Web\DEPLOYED\Ohio" "switchtoconstellation.discoverydev.com" "switchtoconstellation-d
Param(
	[parameter(Mandatory=$true)]
	[alias("d")]
	$deploymentpath,
	[parameter(Mandatory=$true)]
	[alias("s")]
	$server,
	[parameter(Mandatory=$true)]
	[alias("u")]
	$username,
	[parameter(Mandatory=$true)]
	[alias("p")]
	$password,
	[parameter(Mandatory=$true)]
	[alias("r")]
	$remotepath)
#$deploymentpath = "C:\Web\Deployed\Ohio"
#$server = "switchtoconstellation.discoverydev.com"
#$username = "switchtoconstellation-dev"
#$password = 'w8b%duu#9r'
#$remotepath = "www"
$ftpfile = "temp.ftp"
$script:currftppwd = $remotepath
# Walk the deployment folder and emit one FTP command per file,
# creating and changing to the remote directory whenever the local folder changes.
function AddItem($path){
    foreach($f in Get-ChildItem($path))
    {
        #Write-Host "testing $f"
        if ($f.PSIsContainer -eq $True)
        {
            #Write-Host "recursing $f"
            AddItem($f.PSPath);
        }
        else
        {
            $filename = $f.fullname
            #Write-Host "writing $filename to $ftpfile"
            $parentpath = $f.Directory.fullname.Replace($deploymentpath, "")
            # script scope so the current-directory tracking survives the recursion
            if ($script:currftppwd -ne "\$remotepath$parentpath"){
                AppendFtpCmd("MKDIR \$remotepath$parentpath")
                AppendFtpCmd("CD \$remotepath$parentpath")
                $script:currftppwd = "\$remotepath$parentpath"
            }
            AppendFtpCmd("PUT $filename")
        }
    }
}
# need encoding: .net prepends null char for some reason
function AppendFtpCmd($ftpcmd){
    #$ftpfile = "temp.ftp"
    $ftpcmd | out-file -filepath $ftpfile -encoding "ASCII" -append
} 
"OPEN $server" | out-file -filepath $ftpfile -encoding "ASCII"
AppendFtpCmd("USER $username")
AppendFtpCmd("$password")
AppendFtpCmd("CD $remotepath")
AppendFtpCmd("LCD $deploymentpath")
AddItem("$deploymentpath")
AppendFtpCmd("DISCONNECT")
AppendFtpCmd("BYE")
ftp -n -i -s:$ftpfile
