Quartz Scheduler

by MikeHogg 26. October 2013 13:46


Quartz Scheduler seems to be a very robust and mature job scheduler, originally created in Java.  I've only just found out about it and find it useful enough to make a note here for future use.  You can find lots of tutorials, samples, and users online for it.  It is ported to .NET as Quartz.NET, and adding a Quartz scheduler to your project is as easy as a NuGet install.  Simple scheduling works right out of the box, and it can handle complicated cases, just about all of the scheduling cases I've seen.  JobDetails, Jobs, and Triggers are all classes you can mix and match in different combinations.  It also handles persistence if you want to hook up an ADO store (several SQL and NoSQL options available), and logging is built in via another NuGet package, Common.Logging.

 

If you want to inject services into your IJobs, though, you will need to create your own IJobFactory and start the scheduler with it:

Dim schFactory As ISchedulerFactory = _container.Resolve(Of ISchedulerFactory)()
_scheduler = schFactory.GetScheduler()
_scheduler.JobFactory = New ScheduleJobFactory(_container)

Your factory implements NewJob (and, as of Quartz 2.2, ReturnJob, which can do nothing unless you have dispose requirements), and this is where you create the job with your container of choice (Autofac here) so the container can inject what it needs...

public IJob NewJob(TriggerFiredBundle bundle, IScheduler scheduler)
        {
            var jobtype = bundle.JobDetail.JobType;
            try
            {
                // wrap the scheduled job type in the generic MyJob<T> so the container can build it
                var schjobtype = typeof(MyJob<>).MakeGenericType(jobtype);
                var schjob = (IJob)Activator.CreateInstance(schjobtype, _container);
                return schjob;
            }
            catch (Exception e)
            {
                using (var l = _container.BeginLifetimeScope())
                {
                    // resolve the logger from the scope we just opened
                    var logger = l.Resolve<ILogger>();
                    logger.Error(e);
                }
            }
            return new NoOpJob();
        }

... Your JobFactory gets the container from its constructor:

public class ScheduleJobFactory : IJobFactory
    {
        ILifetimeScope _container;
        public ScheduleJobFactory(ILifetimeScope container)
        {
            _container = container;
        } 
    }

I couldn't have figured this part out on my own: your job factory points to a generic job class MyJob&lt;T&gt;, and that class does the container resolve of T. Clever pattern.

public class MyJob<T> : IJob where T : IJob
    {
        ILifetimeScope _container;
        public MyJob(ILifetimeScope container)
        {
            _container = container;
        }
        public void Execute(IJobExecutionContext context)
        {
            using (var lscope = _container.BeginLifetimeScope())
            {
                // resolve the real job from the child scope and delegate to it
                var someJob = lscope.Resolve<T>();
                someJob.Execute(context);
            }
        }
    }

 

Your jobs still implement IJob and don't reference the generic wrapper... just register them with the container like normal, and specify the concrete classes in your quartz.jobs.xml; a sketch of the container side follows below.

 

...
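For the container side of "register them like normal", here is a rough sketch with Autofac (the job class names are invented for the example; ScheduleJobFactory is the factory from above):

// hypothetical composition root
var builder = new Autofac.ContainerBuilder();

builder.RegisterType<CleanupJob>().AsSelf();          // a concrete IJob, invented for the example
builder.RegisterType<NightlyReportJob>().AsSelf();    // another concrete IJob
builder.RegisterType<ScheduleJobFactory>().As<Quartz.Spi.IJobFactory>();
builder.RegisterInstance(new Quartz.Impl.StdSchedulerFactory()).As<Quartz.ISchedulerFactory>();

var container = builder.Build();

var scheduler = container.Resolve<Quartz.ISchedulerFactory>().GetScheduler();
scheduler.JobFactory = container.Resolve<Quartz.Spi.IJobFactory>();
scheduler.Start();   // the xml plugin then schedules whatever it finds in quartz.jobs.xml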

 

Diagnostics: if you have trouble getting your job xml files to work (or are trying to use deprecated 1.0-style xml with an upgraded 2.0 Quartz) and want to see the logging, you can add Common.Logging config to your app.config.  The section definition:

 

<sectionGroup name="common">
      <section name="logging" type="Common.Logging.ConfigurationSectionHandler, Common.Logging" />
</sectionGroup>

The config-

  <common>
    <logging>
      <factoryAdapter type="Common.Logging.Simple.ConsoleOutLoggerFactoryAdapter, Common.Logging">
        <arg key="level" value="DEBUG" />
        <arg key="showLogName" value="true" />
        <arg key="showDataTime" value="true" />
        <arg key="dateTimeFormat" value="yyyy/MM/dd HH:mm:ss:fff" />
      </factoryAdapter>
    </logging>
  </common>

 

 

I've seen lots of online users hook up their common.logging to log4net to monitor the scheduler process, and there is a FactoryAdapter for log4net if you want to go that route.

 

While we are in config, note that Quartz looks for config in several places: first a Java-style quartz.config file, then app.config, then a few other locations.  If you want to drop quartz.config and use only the .NET style, this is a sample app.config section definition:

<section name="quartz" type="System.Configuration.NameValueSectionHandler, System, Version=1.0.5000.0,Culture=neutral, PublicKeyToken=b77a5c561934e089" />

 

and config-

  <quartz>
    <add key="quartz.scheduler.instanceName" value="_scheduler" /> <!-- whatfor-->
    <!--     Configure Thread Pool -->
    <add key="quartz.threadPool.type" value="Quartz.Simpl.SimpleThreadPool, Quartz" />
    <add key="quartz.threadPool.threadCount" value="10" />
    <add key="quartz.threadPool.threadPriority" value="Normal" />
    <!--     Configure Job Store --><!--
    <add key="quartz.jobStore.type" value="Quartz.Simpl.RAMJobStore, Quartz" />
      <add key="quartz.plugin.xml.type" value="Quartz.Plugin.Xml.JobInitializationPlugin, Quartz" />-->
    <add key="quartz.plugin.xml.fileNames" value="~/config/ScheduledJobs.config" />
    <!--<add key="quartz.plugin.xml.scanInterval" value="10" />-->
    <add key="quartz.plugin.xml.type" value="Quartz.Plugin.Xml.XMLSchedulingDataProcessorPlugin, Quartz" />
  </quartz>

That uses the defaults.  The XMLSchedulingDataProcessorPlugin, I think, is the included jobs.xml reader and is required if you use xml, as the default is something else.

Tags:

Architecture | C# | VB.Net

NHibernate midstream

by MikeHogg 7. March 2013 14:30

NHibernate is an active ORM product, and has been for some time.  Coming to it as I have, at version 3, can be difficult to digest, especially as you are soaking up new syntax, even if the patterns are not unfamiliar, and especially since there are many options and support libraries to choose from and the core library has changed so much from version one to two and again to three.

So blindly googling lines of code and quickstarts can easily get you into dark woods unless you pay attention to the dates of posts and understand how NH evolved over time.

First, the patterns.  NH works with a Repository pattern by using db sessions.  Inject the NH SessionFactory into your repositories, and then your repos can run all the NH query goodness instead of you writing a whole bunch of db-layer code.  In its simplest form, you can use an NHHelper class to statically create a SessionFactory inside your repo, or you can use constructor injection and let your injection library do that for you automatically, like here with Ninject:

public class NHibernateSessionFactory
    {
        public ISessionFactory GetSessionFactory()
        { /* implementation shown later in this post */ }

        public class NHibernateSessionFactoryProvider : Ninject.Activation.Provider<ISessionFactory>
        {
            protected override ISessionFactory CreateInstance(Ninject.Activation.IContext context)
            {
                var sessionFactory = new NHibernateSessionFactory();
                return sessionFactory.GetSessionFactory();
            }
        }
    }
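To wire that provider up, the Ninject bindings might look roughly like this (a sketch; the ISession-per-request binding is an assumption, and InRequestScope comes from Ninject.Web.Common):

// hypothetical composition root; requires "using Ninject;" for Get<T>()
var kernel = new Ninject.StandardKernel();

kernel.Bind<ISessionFactory>()
      .ToProvider<NHibernateSessionFactory.NHibernateSessionFactoryProvider>()
      .InSingletonScope();

// one session per web request so repositories can just ask for ISession
kernel.Bind<ISession>()
      .ToMethod(ctx => ctx.Kernel.Get<ISessionFactory>().OpenSession())
      .InRequestScope();   // extension from Ninject.Web.Common

kernel.Bind(typeof(IRepository<>)).To(typeof(Repository<>));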

 

then your Repo looks something like this:

 

 public class Repository<T> : IRepository<T> where T : class
    {
        private readonly ISession session;

        public Repository(ISession s)
        {
            session = s;
        }
        // query members go here; a sketch follows
    }
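Members inside that Repository&lt;T&gt; then become one-liners against the session. A rough sketch (method names are my own; Query&lt;T&gt;() comes from NHibernate.Linq):

        // these would live inside Repository<T>
        // requires: using System.Linq; using NHibernate.Linq;
        public T Get(object id)
        {
            return session.Get<T>(id);       // load by primary key
        }

        public IQueryable<T> Query()
        {
            return session.Query<T>();       // LINQ provider; filter with Where(...)
        }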
 

So GetSessionFactory is where all the NH configuration happens.

Here are your options for configuring...

Originally, everything was XML and config based, so you would build a base NHibernate Configuration in your code, and your web.config would hold a section with the required and optional properties.

 

 
public ISessionFactory GetSessionFactory()
        {
            // old-style: Configuration() reads the hibernate-configuration section from web.config
            var config = new Configuration();
            return config.BuildSessionFactory();
        }

 

Then your web.config would have all the hookup:

 

 <configSections>
    <section name="hibernate-configuration" type="NHibernate.Cfg.ConfigurationSectionHandler, NHibernate" />
 </configSections>
  <connectionStrings>
    <add name="CONN" connectionString="blahblah" providerName="System.Data.SqlClient" />
  </connectionStrings>
  <hibernate-configuration xmlns="urn:nhibernate-configuration-2.2">
    <session-factory>
      <property name="connection.provider">NHibernate.Connection.DriverConnectionProvider</property>
      <property name="connection.driver_class">NHibernate.Driver.SqlClientDriver</property>
      <property name="dialect">NHibernate.Dialect.MsSql2008Dialect</property>
      <property name="connection.connection_string_name">CONN</property>
      <property name="show_sql">true</property>
    </session-factory>
  </hibernate-configuration>
 

Then came Fluent, and config moved into code.  So, if you use that library, you no longer need the web.config section, and you can just use this in your SessionFactory...

 

        
public ISessionFactory GetSessionFactory()
        {
            // fluent-only config: database, mappings, and extra properties all set in code
            FluentNHibernate.Cfg.FluentConfiguration fluentConfiguration = FluentNHibernate.Cfg.Fluently.Configure()
                .Database(FluentNHibernate.Cfg.Db.MsSqlConfiguration.MsSql2008.ShowSql()
                    .ConnectionString(c => c.FromConnectionStringWithKey("CONN")))
                // one AddFromAssemblyOf<> is enough; it picks up all Fluent maps (classes inheriting ClassMap<T>) in that assembly
                .Mappings(m => m.FluentMappings.AddFromAssemblyOf<Maps.EventMap>())
                .ExposeConfiguration(cfg => cfg.SetProperty("adonet.batch_size", "20"))
                .ExposeConfiguration(c => c.SetProperty("generate_statistics", "true"));
                //.ExposeConfiguration(BuildSchema)

            return fluentConfiguration.BuildSessionFactory();
        }

In addition to config, when setting up your NH environment you need to initialize your maps.  If you already have database tables, the first order of business is to get NMG, the NHibernate Mapping Generator.  It reads the db and outputs entity class files and class maps, which is a huge help.  (In NMG's preferences, upper right-hand side, you can choose which format of mapping files you want.)

 

Originally, these would be XML files, conventionally named something.hbm.xml.  We no longer need to do that.  You also see FluentNHibernate and the new Loquacious format.  FNH uses the ClassMap parent class, and I believe that is the convention that allows AddFromAssemblyOf&lt;&gt;() to work.  Loquacious uses the ClassMapping parent, and I believe offers more flexibility in filtering which types from an assembly get mapped.  A sketch of both styles follows below.
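For reference, the two styles look roughly like this for the same entity (a sketch; the Event properties and table name here are invented):

    // invented entity for illustration
    public class Event
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }

    // FluentNHibernate: inherit ClassMap<T>
    public class EventMap : FluentNHibernate.Mapping.ClassMap<Event>
    {
        public EventMap()
        {
            Table("Events");
            Id(x => x.Id);
            Map(x => x.Name);
        }
    }

    // Loquacious / mapping-by-code: inherit ClassMapping<T>
    public class EventMapByCode : NHibernate.Mapping.ByCode.Conformist.ClassMapping<Event>
    {
        public EventMapByCode()
        {
            Table("Events");
            Id(x => x.Id, m => m.Generator(NHibernate.Mapping.ByCode.Generators.Identity));
            Property(x => x.Name);
        }
    }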

With NH 3 the project merged in the Loquacious (mapping-by-code) codebase that was previously a separate library, but it is not very mature, so most of the posts I have seen use it either for simpler project requirements or in tandem with a large base of existing Fluent mappings.  I could not figure out a way to add MBC maps to FluentMappings, so in case you ever want to do both, here is code that initializes and maps both ways: build a base NH Configuration (with web.config) for the MBC maps, then build your FluentConfiguration from that base, adding the Fluent maps in that process...

 
public ISessionFactory GetSessionFactory()
        {
            
            // old style config 
            var mm = new NHibernate.Mapping.ByCode.ModelMapper();
            Type[] mappingTypes = typeof(Maps.EventMap).Assembly.GetExportedTypes().Where(t => t.Name.EndsWith("Map")).ToArray();
            mm.AddMappings(mappingTypes);
 
            var config = new Configuration(); 
            config.AddMapping(mm.CompileMappingForAllExplicitlyAddedEntities());
 
 
            // fluent style config.  We use both to apply loquacious mapping by code to old style config, and then create FluentConfiguration with it
            FluentNHibernate.Cfg.FluentConfiguration fluentConfiguration = FluentNHibernate.Cfg.Fluently.Configure(config)
                                                   .Database(FluentNHibernate.Cfg.Db.MsSqlConfiguration.MsSql2008.ShowSql()
                                                    .ConnectionString(c => c.FromConnectionStringWithKey("CONN")))
                                                    // only need one From Assembly to point to all the maps These are for Fluent MAPS inherting from ClassMap
                                                   .Mappings(m => m.FluentMappings.AddFromAssemblyOf<Maps.EventMap>())                
                                                   .ExposeConfiguration(cfg => cfg.SetProperty("adonet.batch_size", "20"))
                                                   .ExposeConfiguration(c => c.SetProperty("generate_statistics", "true"));                                       
                                                    //.ExposeConfiguration(BuildSchema) 
 
            return fluentConfiguration.BuildSessionFactory();
        }

Dependency Injection is not a bad word

by MikeHogg 12. December 2012 10:10

"Dependency Injection" is not a bad word.  Besides the fact that it's two words, if anything is bad abou it, it is only that "Dependency" is a bad word,  not Injection.  If you can't encapsulate everything necessary for your class or system to work, then what you want to do is expose your dependencies, in a contract, so other developers can easily figure out what is needed to get what they want from your library.

 

You can do this by limiting constructor signatures, or method signatures, to individual properties that you rely on, or by providing a signature that takes an entire class, and that is what they usually call dependency injection.

 

Let's say you have a custom MembershipProvider, and it provides several public methods, which other developers might find browsing your namespace.  Let's say another developer finds what she is looking for in your dll, and so tries to use it in her project.  Let's say you aren't exposing any static methods, so she must now instantiate the object, and any dependencies your classes have become her problem.
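A tiny, hypothetical illustration of the difference (none of these types are real APIs): the first provider hides its dependency and surprises the caller at runtime; the second exposes it in the constructor, so the contract is obvious.

    // invented store abstraction and implementation, for illustration only
    public interface IUserStore
    {
        bool ValidateUser(string email, string password);
    }
    public class SqlUserStore : IUserStore
    {
        public SqlUserStore(string connectionString) { /* ... */ }
        public bool ValidateUser(string email, string password) { return false; }
    }

    // hidden dependency: nothing in the signature tells you a connection string is needed
    public class OpaqueMembershipProvider
    {
        public bool ValidateUser(string email, string password)
        {
            var store = new SqlUserStore("Server=...;Database=...;");  // surprise
            return store.ValidateUser(email, password);
        }
    }

    // exposed dependency: the constructor is the contract
    public class ExplicitMembershipProvider
    {
        private readonly IUserStore _store;
        public ExplicitMembershipProvider(IUserStore store) { _store = store; }
        public bool ValidateUser(string email, string password)
        {
            return _store.ValidateUser(email, password);
        }
    }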

Tags:

Architecture

Urls are not strings

by MikeHogg 2. September 2012 09:39

The idea occurred to me recently that in a web project, if I do a Ctrl-F for "http://" and find anything that isn't a third-party link, then I might want to take a look at the way I perceive urls.  There are two main concepts that occur to me at the same time.  The first is that my web project should be able to replicate every feature on every environment (with blue-moon exceptions), including on my local machine in debug mode.  So easy stuff: every link should work, and I shouldn't have to worry about clicking links in special places that might take me to, for instance, the production version of the site.  The second premise is that I remember reading through some .NET framework intellisense and finding that Uri is a full-fledged class (not just another string), so the capability exists to do much more with it, and in web/MVC projects links seem to me to be more than just strings; they are like... method calls, if you will, on controller actions.

 

How would I go about using this idea?  On application startup, set a static HOSTNAME, PORT, SCHEME, whatever you intend to do with links, from web.config (post-transform), and then craft your Uris with that, handling specific ports (are you setting Cassini to use the same port in debug?  there is a use for that).  The UriBuilder lets you craft every part of the url dynamically, instead of having to use one long hard-coded string over and over.  Wrap the specific functionality you need in a static class (one or two methods like GetSSLUrl(pathname) and GetNonSSLUrl(pathname), sketched below) and you can go back to writing one-liners for your links, and even use them in razor templates.  There is so much in the Uri class that you should be able to handle all kinds of situations, with strongly typed code, and have everything testable, 'inside the box', meaning write it once and never have to worry about it again.  "Will those links work on that page?"  "Will they work on that server?"  "When I change X, when I turn on SSL, when I send them in an email, when I publish to a new build server?"  Promote your hard-coded links to Uris and write it once and never have to worry about it again.
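A minimal sketch of that static wrapper (names and defaults are made up; the values would be set once at Application_Start from web.config):

    // requires: using System;
    public static class SiteUrls
    {
        // set these at Application_Start from web.config, post-transform
        public static string HostName = "localhost";
        public static int? Port = null;              // e.g. Cassini's fixed debug port
        public static string DefaultScheme = "http";

        public static Uri GetNonSSLUrl(string path)
        {
            var b = new UriBuilder { Scheme = DefaultScheme, Host = HostName, Path = path };
            if (Port.HasValue) b.Port = Port.Value;
            return b.Uri;
        }

        public static Uri GetSSLUrl(string path)
        {
            var b = new UriBuilder { Scheme = Uri.UriSchemeHttps, Host = HostName, Path = path };
            return b.Uri;
        }
    }

    // usage, in code or in a razor template:
    // <a href="@SiteUrls.GetSSLUrl("/account/logon")">Log On</a>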

 

Migrating to your UAT host should be painless and all the links will now magically point to httporhttpsScheme://myuathost/:optionalport/urlstringhere.whatever.ext and migrating to your production host should be the same.  Want to set up a third environment to test some branch?  Just add the transforms.

Tags:

Architecture

A More Mature User (model)

by MikeHogg 12. August 2012 09:52

My MVVM and MVC User models have usually been a hierarchy of different classes starting with the simplest name/password and adding more properties with each inherited subclass. 

 

(Side note: I've noticed that the tendency to add more properties to different versions of subclasses can get out of hand and muck up a membership provider, especially when mixed with Roles, when what is really called for is a Profile provider for all those descriptive properties.  See the Decorator pattern and the MS ProfileProvider.)

 

 

 

This time a client requirement surprised me by making even username/password optional, which I've never done before.  As a matter of fact, most of my user hierarchy was built to support a base class of required properties using MVC DataAnnotation Required attributes.

 

So I rewrote it as one base User with no required fields, and then various subclasses that act more like ViewModels.  I also put an interface on the User and changed all my Membership and Repository arguments to the interface.  Now this pattern seems much more flexible, simple, and easily extensible.  I don't know why I didn't see this before.

 

 

public interface IUserModel
    {
        int Id { get; set; }
 
        string EmailAddress { get; set; }
        string Password { get; set; }  
 
        bool Active { get; set; }
        string FirstName { get; set; }
        string LastName { get; set; }
        string PhoneNumber { get; set; }
        bool GetsEmail { get; set; }
 
        IEnumerable<RoleModel> Roles { get; set; }
    }
    
    [Serializable]
    public class UserModel : IUserModel
    {
        [Key]
        public int Id { get; set; }
 
        [Display(Name = "Email Address")]
        [StringLength(255)]
        [MyLibrary.Web.Mvc3.Attributes.EmailAddress]
        public virtual string EmailAddress { get; set; }
 
        [DataType(DataType.Password)]
        [Display(Name = "Password")]
        public virtual string Password { get; set; }
 
        [Display(Name = "Active")]
        public bool Active { get; set; }
 
        [Display(Name = "First name")]
        [StringLength(50)]
        public virtual string FirstName { get; set; }
 
        [Display(Name = "Last name")]
        [StringLength(50)]
        public string LastName { get; set; }
 
        [StringLength(255)]
        [Display(Name = "Phone Number")]
        public string PhoneNumber { get; set; }
 
        [Display(Name = "Gets Emails?")]
        public bool GetsEmail { get; set; }
 
        [Display(Name = "Roles")]
        public IEnumerable<RoleModel> Roles { get; set; }
    }
 
 
    public class LogOnModel : UserModel
    {
        [Required]
        [Display(Name = "Email Address")]
        [StringLength(255)]
        [MyLibrary.Web.Mvc3.Attributes.EmailAddress]
        public override string EmailAddress { get; set; }
 
        [Required]
        [DataType(DataType.Password)]
        [Display(Name = "Password")]
        public override string Password { get; set; }
 
    }
 
    public class RegisterModel : LogOnModel
    { 
        [Required(ErrorMessage = "Please enter a first name")]
        public override string FirstName { get; set; }
 
        [DataType(DataType.Password)]
        [Compare("Password")]
        [Display(Name = "Confirm Password")]
        public string PasswordConfirm { get; set; }
    }
 
    public class ChangePasswordModel : UserModel
    {
        [Required]
        [DataType(DataType.Password)]
        [Display(Name = "Old Password")]
        public string OldPassword { get; set; }
        
        [Required]
        [DataType(DataType.Password)]
        [Display(Name = "Password")]
        public override string Password { get; set; }
 
        [DataType(DataType.Password)]
        [Compare("Password")]
        [Display(Name = "Confirm Password")]
        public string PasswordConfirm { get; set; }
    }
 
    public class FoundPasswordModel : UserModel
    {
        [Required] 
        public override string EmailAddress { get; set; }
 
        [Required]
        [DataType(DataType.Password)]
        [Display(Name = "Password")]
        public override string Password { get; set; }
 
        [DataType(DataType.Password)]
        [Compare("Password")]
        [Display(Name = "Confirm Password")]
        public string PasswordConfirm { get; set; }
    }
 
    public class ValidatedUserModel : UserModel
    {
        [Required]
        [StringLength(255)]
        [MyLibrary.Web.Mvc3.Attributes.EmailAddress]
        public override string EmailAddress { get; set; }
 
        [Required(ErrorMessage = "Please enter a first name")]
        public override string FirstName { get; set; }
 
        public ValidatedUserModel() { }
        public ValidatedUserModel(IUserModel user)
        {
            Active = user.Active;
            EmailAddress = user.EmailAddress;
            FirstName = user.FirstName;
            GetsEmail = user.GetsEmail;
            Id = user.Id;
            LastName = user.LastName;
            PhoneNumber = user.PhoneNumber;
            Roles = user.Roles;
        }
    }

Tags:

MVC | Architecture

Static Repositories (vs Instance)

by MikeHogg 30. July 2012 10:10

I find for small projects that I always used to create static repositories.  It can be a speedy mechanism to develop with, and I probably took the lead from MS's static treatment of its Membership classes, plus years of working against Oracle databases when no .NET ORM existed that would go against Oracle...

 

I like static repositories because I use them to return typed collections, and, with LINQ where necessary, they provide easy one-liners in most places I interface with data access, where I don't need to cache or watch my calls for performance, like in web sites.  And in web cases, note that on each POST or GET you are recreating all your objects anyway, so I would just add predicate calls against the static repo.  Here's what many of my repo GetXxx methods look like:

 

 

        internal static IEnumerable<Models.FileModel> GetFileModels(string username = null)
        {
            List<SqlParameter> parms = new List<SqlParameter>();
            parms.Add(new SqlParameter("@EmailAddress", username)); 
 
            IDataReader r = DatabaseHelper.GetDataReader("sp_GetFileModels", parms);
            DataTable t = new DataTable();
            t.Load(r);
 
            var result = from DataRow row in t.Rows
                         select new Models.FileModel
                         {
                             Id = Convert.ToInt32(row["Id"]),
                             RequestDate = DateTime.Parse(row["RequestDate"].ToString()),
                             TitleName = row["TitleName"].ToString(),
                             SomethingId = Convert.ToInt16(row["SomethingId"]),
                             SomethingTypeId = Convert.ToInt16(row["SomethingTypeId"]),
                             FileName = row["Name"].ToString(),
                             FileTypeId =  Convert.ToInt32(row["FileTypeId"]),
                              FileTypeName = row["FileTypeName"].ToString(),
                              ContentEncoding = row["ContentEncoding"].ToString(),
                              ContentLength = Convert.ToInt32(row["ContentLength"]),
                             ActiveFlag = Convert.ToBoolean(row["ActiveFlag"]),
                             EnabledFlag = Convert.ToBoolean(row["EnabledFlag"]),
                             SomeCompanyName = row["SomeCompanyName"].ToString(),
                             SomeNumber = row["SomeNumber"].ToString(),
                             EstimatedNumberOfSomethings = MH.lib.DatabaseHelper.ConvertDBNullsToNInt(row["EstimatedNumberOfSomethings"]),
                             EstimatedDeadline = Convert.IsDBNull(row["EstimatedDeadline"]) ? new DateTime() : DateTime.Parse(row["EstimatedDeadline"].ToString()),
                             FinalDeadline = DateTime.Parse(row["FinalDeadline"].ToString()),
                             ActivityDate = DateTime.Parse(row["ActivityDate"].ToString()),                               
                         }; 
            return result; 
        }

 

 

And then in my business layer I simply call GetXxx, and if I need a single item or a filtered set I just add a LINQ predicate.

 

 

model = lib.Repository.GetFileModels(UserNameOrNullIfAdmin()).FirstOrDefault(f =>
                    f.SomethingId == somethingid &&
                    (f.FileTypeId == filetypeid || ( filetypeid == 0 && f.SomethingTypeId == lib.CONST.ACONSTANTID &&  // in case of xyz filetypeid arg is zero
                                                        (f.FileTypeId == lib.CONST.SOMECONSTANTID || f.FileTypeId == lib.CONST.ANOTHERCONSTANTID)))
                    && f.ActiveFlag == true);
 
                if (model == null)
                {
                    Models.Something s = lib.Repository.GetSomethings(UserNameOrNullIfAdmin(), somethingid).FirstOrDefault();
 
                    model = new Models.FileModel
                  {
                      SomethingId = somethingid,
etc., etc.

 

I know this isn't pushing my predicate all the way to the database like Entity Framework would, so it is something to watch for.  In my experience, most small cases top out at a few hundred business objects and only require light attention to performance.  And this handles most non-business-related applications.

 

But most of the MVC framework codebases and examples I have seen shared or posted on blogs happen to use the instance repository pattern...  And I hate not knowing why something is the way it is, so... what are the reasons NOT to use static repos?

 

I searched for a while, but I only found maybe one informative post, with a couple good answers.

http://stackoverflow.com/questions/5622592/why-arent-data-repositories-static

 

It makes a lot of sense.  Instantiated repositories can be mocked, and your front layers tested; that is the number one reason.  I'm not sure that instantiation is a requirement for caching, but it makes sense if you are getting into a site that large.  The second reason (for me this is a big one) is that you can use inheritance and interfaces to practice DRY and the Repository pattern, and not have to write all of your boilerplate data access code in each repository.
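To make the testability point concrete, here's a rough sketch of the instance style (the interface and service are invented; FileModel is the model from the code above): the consumer takes the interface in its constructor, so a test can hand it a fake instead of a database.

    // requires: using System.Collections.Generic; using System.Linq;
    // invented interface over the same data access
    public interface IFileRepository
    {
        IEnumerable<Models.FileModel> GetFileModels(string username = null);
    }

    public class FileService
    {
        private readonly IFileRepository _repo;
        public FileService(IFileRepository repo) { _repo = repo; }

        public Models.FileModel FindActive(int somethingId, string username)
        {
            return _repo.GetFileModels(username)
                        .FirstOrDefault(f => f.SomethingId == somethingId && f.ActiveFlag);
        }
    }

    // in a unit test, pass a hand-rolled fake (or a mock) instead of hitting the db:
    // var service = new FileService(new FakeFileRepository(knownModels));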

 

The first point speaks to testing, and I haven't seen or heard of any local developers who work at a place that actually promotes unit tests.  But I do write them on occasion in some of my projects, and it helps my writing style.  Food for thought...

Tags:

Architecture | C#

WCF vs MVC REST API

by MikeHogg 28. May 2012 15:25

 

What is this REST API that I keep hearing about?  I have been using WCF for years, but now the new buzzword is REST API for web services.

First, a good background found on this page: http://www.codeproject.com/Articles/255684/Create-and-Consume-RESTFul-Service-in-NET-Framewor

What is REST & RESTful?

Representational State Transfer (REST) was introduced by Roy Fielding in 2000; it is an architectural style for large-scale networked software that takes advantage of the technologies and protocols of the World Wide Web. REST describes how concentrated data objects, or resources, can be defined and addressed, stressing easy exchange of information and scalability.

In 2000, Roy Fielding, one of the primary authors of the HTTP specification, wrote a doctoral dissertation titled Architectural Styles and the Design of Network-based Software Architectures.

REST, an architectural style for building distributed hypermedia driven applications, involves building Resource-Oriented Architecture (ROA) by defining resources that implement uniform interfaces using standard HTTP verbs (GET, POST, PUT, and DELETE), and that can be located/identified by a Uniform Resource Identifier (URI).

REST is not tied to any particular technology or platform – it’s simply a way to design things to work like the Web. People often refer to services that follow this philosophy as “RESTful services.”

My current use case asked for three clients served by one codebase (one WPF client and two web site clients), so I figured WCF was the best way to go. But I wanted to see what new tech MS had for us...

I saw many examples of REST controller actions in MVC, but they were using REST architecture over HTTP, without typed endpoints and instant clients from WSDL, which was the main reason WCF would have been so good for my case.  WCF is so mature now that you rarely have to do more than click a few times and add some properties to a project config before you have strongly typed client behaviors.  What do I get with this new REST stuff?  A lot of manual work and no strongly typed objects.  It sounds like a step backwards to me.

Phil Haack agreed with me...

http://haacked.com/archive/2009/08/17/rest-for-mvc.aspx

"When your service is intended to serve multiple clients (not just your one application) or hit large scale usage, then moving to a real services layer such as WCF may be more appropriate." 

I finally found what I was looking for (in the background I linked to above) in the WCF REST Starter Kit built on .NET 4.0. It has strong typing and automated client creation. It built REST on top of WCF and added some attributes you could decorate your WCF project with to work over the new WebHttpEndpoint. http://www.codeproject.com/Articles/255684/Create-and-Consume-RESTFul-Service-in-NET-Framewor

This was what I was looking for, but since it was built ON TOP of WCF, I didn't see the point. To my point, Sam Meacham warned in Sep 2011 not to use the WCF REST Starter Kit, in the discussion on that page:

http://www.codeproject.com/Articles/255684/Create-and-Consume-RESTFul-Service-in-NET-Framewor?fid=1652761&df=90&mpp=50&noise=3&prof=False&sort=Position&view=Quick&fr=51#xx0xx

"The WCF REST Starter kit is abandoned, and will no longer be developed. WCF was designed to be protocol agnostic. REST services are generally built on the HTTP protocol, using all of the richness of http for your rest semantics. So WCF as it existed was actually a really bad choice for building rest services. You basically had to factor back in all of the http-ness that wcf had just factored out.

Glenn Block at Microsoft, who (with the community) developed the Managed Extensibility Framework (MEF), was reassigned to work on the WCF REST story at MS going forward. They are currently developing the WCF Web API, which will be the new way to create REST services with WCF.

Also, keep in mind that REST has no service description language like WSDL or anything, so things like service location and automatic client generation don't exist. WCF certainly isn't your only chance for creating REST services in .NET. I created the RestCake library for creating REST services based on IHttpHandler. Also, IHttpHandler is a very simple interface for creating REST services. A lot of people prefer to use MVC 3."

So, I conclude WCF is not going away, and is the appropriate tool for this case.  The WCF Web API that I heard rumors about appears to still be in development, coming with MVC4.

I will look at that for a future project but not this one... http://wcf.codeplex.com/wikipage?title=WCF%20HTTP

 

----

PS

Time passed, and I found myself playing with some Android development and wanting to hook up to a WCF service, which is when I found out what is probably one of the big reasons REST adoption is so strong: Android's Java libraries don't support SOAP well at all, even with third-party help!

Tags:

Architecture | REST | WCF

An example of one of my least favorite projects

by MikeHogg 16. May 2012 14:36

One of my least favorite projects where I had control over the outcome was my first WPF project. I had been doing aspnet web apps and winform apps for a few years. I hadn’t really learned a lot about patterns or architecture, but I was exposed to a senior consultant who had a particular effect on me. Under his influence, I started to open my eyes to new technology. I realized that I needed to accelerate my learning or my career was not going to go anywhere.

So among other things, I tried WPF for my next project instead of Winforms. The problem was, that I applied the event driven, static design of Winforms to WPF and it was not built for that.

Once I had invested enough time in the initial design and started to hit my first roadblocks, I realized that WPF was built to work against a pattern called MVVM, and I didn’t want to learn a new pattern on top of a new framework. I kept hitting roadblocks in UI development and each time I found solutions were always in MVVM and so they were not available to me. I ended up writing lots of hacks and disorganized code instead of learning about MVVM.

I delivered in nine months but it was a long nine months. My immediate next opportunity was a small deliverable, and I did that in WPF while learning MVVM, and realized my mistake. I was amazed at how easy it was if I used the correct pattern.  New technologies are as much, if not more, about patterns as they are about the nuts and bolts.

Tags:

Architecture | Me

Logging From Day One (and Exception Handling)

by MikeHogg 9. May 2012 09:50

NLog is so easy to use, it really is like plug and play.  Or drag and drop.  Add the dll to your references.  Add the config shown below to your web.config, targeting either a file or a db table (what I use).  Then, in any class where you want to use the logger, just add a line for the static instance:

    public class HomeController : MH.Controllers.AController
    {
        private static NLog.Logger logger = NLog.LogManager.GetCurrentClassLogger(); 

 

 

And then to use it:

 

    logger.Info("Some mess");

 

No reason not to have logging available in every web app from the start.  I usually use a Log table like the one this web.config describes:


<configuration>
  <configSections>
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog"/>...  </configSections>
...  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" >
    <targets> 
      <target name="db" xsi:type="Database" connectionStringName="CONN"
              commandText="insert into Log(Level, Source, Message, Audit_Date) values(@level, @logger, @message, @time_stamp);">
        <parameter name="@time_stamp" layout="${date}"/>
        <parameter name="@level" layout="${level}"/>
        <parameter name="@logger" layout="${logger}"/>
        <parameter name="@message" layout="${message}"/>
      </target> 
    </targets>
 
    <rules>
      <logger name="*"  writeTo="db"></logger> 
    </rules>
  
  </nlog>

If you can't get it to start working, try using a log file target first, or you can turn on NLog's internal logging with attributes like this example:
  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        internalLogFile="c:\mike.log" internalLogToConsole="true" throwExceptions="true">
    <targets>
      <target xsi:type="File" name="file" fileName="${basedir}/n.log" />

Oh, and while we're here, ELMAH is always in my projects, even before NLog.  It's just as easy, and actually comes with more features.  I use it with the db table and automatic emails.  This is all you need to get up and running...

<configuration>
  <configSections>
    <sectionGroup name="elmah">
      <section name="security" requirePermission="false" type="Elmah.SecuritySectionHandler, Elmah" />
      <section name="errorLog" requirePermission="false" type="Elmah.ErrorLogSectionHandler, Elmah" />
      <section name="errorMail" requirePermission="false" type="Elmah.ErrorMailSectionHandler, Elmah" />
      <section name="errorFilter" requirePermission="false" type="Elmah.ErrorFilterSectionHandler, Elmah" />
    </sectionGroup>
  </configSections>
  ...
 
    <httpModules>
      <add name="ErrorLog" type="Elmah.ErrorLogModule, Elmah" />
      <add name="ErrorMail" type="Elmah.ErrorMailModule, Elmah" />
      <add name="ErrorFilter" type="Elmah.ErrorFilterModule, Elmah" />
    </httpModules>
  ...
  <system.webServer>
    <validation validateIntegratedModeConfiguration="false"/>
    <modules runAllManagedModulesForAllRequests="true"> 
        <add name="ErrorLog" type="Elmah.ErrorLogModule, Elmah" preCondition="managedHandler" />
        <add name="ErrorMail" type="Elmah.ErrorMailModule, Elmah" preCondition="managedHandler" />
        <add name="ErrorFilter" type="Elmah.ErrorFilterModule, Elmah" preCondition="managedHandler" />
    </modules> 
  </system.webServer>
  ...
  <elmah>
    <!--
        See http://code.google.com/p/elmah/wiki/SecuringErrorLogPages for 
        more information on remote access and securing ELMAH.   -->
    <security allowRemoteAccess="true" />
    <errorLog type="Elmah.SqlErrorLog, Elmah" connectionStringName="CONN"   >
    </errorLog>
    <errorMail
       to="mike.hogg@havasdiscovery.com"
       subject="[ELMAH] ACMT_Web Exception"  >
    </errorMail> 
    
  </elmah>
  <location path="elmah.axd" inheritInChildApplications="false">
    <system.web>
      <httpHandlers>
        <add verb="POST,GET,HEAD" path="elmah.axd" type="Elmah.ErrorLogPageFactory, Elmah" />
      </httpHandlers>
      <!-- 
        See http://code.google.com/p/elmah/wiki/SecuringErrorLogPages for 
        more information on using ASP.NET authorization securing ELMAH.      -->
      <authorization>
        <allow roles="Admin" />
        <deny users="*" />
      </authorization>
    </system.web>
    <system.webServer>
      <handlers>
        <add name="ELMAH" verb="POST,GET,HEAD" path="elmah.axd" type="Elmah.ErrorLogPageFactory, Elmah" preCondition="integratedMode" />
      </handlers>
    </system.webServer>
  </location>
</configuration> 

There's a db script to create the necessaries.  I think that's it.  It comes with an admin area and a dashboard app automatically; if you set up authorization in your web.config you should be able to see it with the Admin role and no further configuration.  ELMAH is good for catching all uncaught exceptions.  It has replaced my standard libraries and error handling methods in global.asax.

 

I also set up my own ErrorController, and some views, for my handled (known) errors.

public class ErrorController : AController
    {
        public ActionResult Index()
        { 
            Models.Error e = GetError();
            e.Title = "Error!";
            e.Message = "We are sorry.  An error has occurred.  Please try again or contact support";
 
            return View(e);
        }
 
        public ActionResult NotFound()
        {
            Models.Error e = GetError();
            e.Title = "Page Could Not Be Found";
            e.Message = "Sorry, that page could not be found";
 
            return View(e);
        }
 
        private Models.Error GetError()
        {
            Models.Error result = new Models.Error();
            Exception ex = null;
 
            try
            {
                ex = (Exception)HttpContext.Application[Request.UserHostAddress.ToString()];
            }
            catch { }
 
            if (ex != null) result.Exception = ex;
            
            return result;
        }
    }
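GetError assumes something upstream stashed the exception in Application state keyed by the caller's address.  That hand-off isn't shown in this post, but a global.asax sketch of it might look like this (an assumption about the setup, not a drop-in):

        // Hypothetical Application_Error in global.asax, pairing with GetError() above.
        protected void Application_Error(object sender, EventArgs e)
        {
            Exception ex = Server.GetLastError();
            if (ex == null) return;

            // stash it where ErrorController.GetError() looks for it
            Application[Request.UserHostAddress] = ex;
            Server.ClearError();

            var httpEx = ex as HttpException;
            if (httpEx != null && httpEx.GetHttpCode() == 404)
                Response.Redirect("~/Error/NotFound");
            else
                Response.Redirect("~/Error");
        }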

If you want to manually log errors in your app using ELMAH, just do this (wrapped in my lib/logger library):

 

 

public static void LogWebException(Exception ex)
        {
            try
            {
                Elmah.ErrorSignal.FromCurrentContext().Raise(ex, System.Web.HttpContext.Current);
            }
            catch { /* never let logging take the app down */ }
        }

 

Or... add an exception filter, and in that hook tell ELMAH to log handled exceptions.  Now all of your handled exceptions will be logged as well.

namespace MH.Web.Mvc3.Controllers
{
    public class ElmahHandledErrorLoggerFilter : IExceptionFilter
    {
        public void OnException(ExceptionContext context)
        {
            // Log only handled exceptions, because all other will be caught by ELMAH anyway.
            if (context.ExceptionHandled)
                Elmah.ErrorSignal.FromCurrentContext().Raise(context.Exception);
        }
 
        // ADD THIS TO GLOBAL.ASAX:
        //public static void RegisterGlobalFilters(GlobalFilterCollection filters)
        //{
        //    filters.Add(new ElmahHandledErrorLoggerFilter());
        //    filters.Add(new HandleErrorAttribute());
        //}
    }
}

 

 

 

ELMAH has a habit of becoming bothersome with all the 404s for robots.txt.  Put this in your web.config (inside the elmah section) to stop them...

 

 

    <errorFilter>
      <test>
        <or>
          <and>
            <equal binding="HttpStatusCode" value="404" type="Int32" />
            <equal binding="Context.Request.Path" value="/favicon.ico" type="string" />
          </and>
          <and>
            <equal binding="HttpStatusCode" value="404" type="Int32" />
            <equal binding="Context.Request.Path" value="/robots.txt" type="string" />
          </and>
        </or>
      </test>
    </errorFilter>
    
  </elmah>

You can Depend on it

by MikeHogg 12. April 2012 10:09

Imagine for a second that you wrote an entirely (well, mostly) self-contained application.  Let's say it has its own home-built web server, uses file-based persistence, and is compiled entirely into one executable.  All it needs is a particular OS and filesystem to run.  It doesn't happen like that, but we are pretending for a minute.  Now, let's say the list of web server features to implement in the next release has grown so long that you don't know how you are going to deliver them all, and someone suggests looking at the third-party web servers out there; pretend IIS and Apache, or Tomcat.  And now you invite IIS into your little ecosystem, for better or for worse, till death do you part (or whenever the rewrite comes along): you are now wedded not only to your language compiler, OS, HTML interpretations, and NTFS filesystem, but to this IIS application.

Of course it sounds ludicrous, because we are so used to having Apache or IIS.  It is a standard requirement.

It's everywhere now and this is good. Your GoDaddy account has two options, Apache or IIS, and nobody thinks twice about it. Same with libraries.

You never think about it but even using .net 1.1 or .net 4.0 is a dependency.

It is something that becomes a responsibility to manage for the life of your application.

 

I used to work on an 8-year-old legacy .NET web application, created by contractors who had long gone and whose names nobody remembered any longer.  It was a mess, sure (few codebases exist that don't look like messes to anyone but the authors), but it was 8 years old and running just fine.  I worked on this job next to a contractor superstar, one of those famously notorious contractors who sell you the latest and greatest and the moon on top of that, in half the time you wanted, and are not around three months after delivery to answer questions about the behemoth application they delivered held together by shoestrings.

 

Mind you- I learned a ton from this guy.  I used to be partial to old technology, to the point of mistrusting anything new.  Prejudicial, even.  I used to rationalize it as being loyal to the old team... trusting only the tried and true, and being cool, a real nerd who only used obscure old command line tools and found fault with anything that tried to take away my manual control.  In retrospect, I wonder how much of it was just a simple fear of learning new things.  Anyway, this contractor showed me what it was like to be on fire about new technology.  He was always a couple versions ahead in everything.  We would be discussing some new feature that we just found out about in c# 2.0 and he would tell us to just wait and see how fast we get along when we finally get to c# 3.0.  We'd be discussing the upcoming 3.0 framework and he'd be talking about the 3.5 update and the new features in 4.0.  He'd bolt on Application Blocks by the six pack, dotnetnukes and bootstrapper libraries on a whim it seemed.  And he made a lot of money.  He was an independent contractor, made his own hours, drew his own contracts, worked hard, and made a lot of money.  But his answer to any of my questions about the inner workings of some framework call or how to expose a property properly would usually be that I should download and add some application to my codebase.

 

Of course, the way I paint it, it doesn't sound all that good.  And you can guess how the story will end.  But I did learn how to overcome my fear of the new tech.  Learn or be run over by developers that are learning.  But maybe with a bit of moderation.  The story continues... Our team eventually had to upgrade the servers that our project was hosted on from Server 2000 to 2008, and we took the opportunity to upgrade from 1.1 to 3.5.  Along with the one or two library application blocks we depended on from 2002, we had 50 or 60 projects in our web application, and it was fairly painless to upgrade everything, and that application I am fairly confident today, is still chugging along, with new features being added to it even, 11 years old...

 

A couple years later I was asked to get into a project to fix some application that wasn't working.  Turns out it was that contractor's application from the story- Most of the eight or ten feature sets/third party application blocks that composed the total web application had stopped working within the first three or six months of release, long after he had gone, and the users had just not been able to find resources to fix it.

 

Most of the standard dependencies (the web servers, the language frameworks, the operating systems, the filesystems) are pretty low maintenance.  I can imagine most applications lasting 8 or 10 years before being absolutely forced to think about them.  But they are there, and every time you add one more to the pot, you're adding to that total cost.

You're adding to the time and effort some developer (likely not you) will have to spend poring through your code five years from now looking for conflicts or compile errors against a new framework, or hunting down obscure bugs that happened when someone decided to just upgrade and see if it all works.  There is the possibility that business decisions get made to rewrite the application, or buy a replacement from a vendor, or spend the time and money to fix it.  There is the possibility that your pride and joy dies an early death at the first sign of trouble instead of fading away into the sunset several generations down the road.

So someone explains all the benefits of relational databases, and you find that to be a Good Thing, and add one to your application.  You weigh the odds, make a judgment call, and add a third-party library for logging.  And another, jQuery.  Add a new architecture pattern, AWS.  Add a new framework, MVC4.  And now the future is not as stable as before.  What is your application's life expectancy now?  What is your confidence in that number?  How will you approach the decision to add the next dependency to your application?

Tags:

Architecture

About Mike Hogg

Mike Hogg is a c# developer in Brooklyn.


Favorite Books

This book had the most influence on my coding style. It drastically changed the way I write code and turned me on to test-driven development, even if I don't always use it. It made me write clearer, functional-style code using principles such as DRY, encapsulation, and single responsibility. amazon.com

This book opened my eyes to a methodical and systematic approach to upgrading legacy codebases step by step, incrementally transforming code blocks into testable code before making improvements. amazon.com
