Building Angular 2 App With Web API And .NET Core

21. May 2016 18:45

.NET Core ASP.NET MVC 

Setting up a new Angular 2 app with Web API and .NET Core is easy, but it can be a bit tricky. The older beta releases of Angular 2 work fine as there are not many files to reference and work with. When I started using Angular 2 it was at RC1, and the way files are referenced in the app is a bit different from the older versions of Angular 2. I don't want to repeat these steps again and again, so I put up a GitHub repo for this seed project. Clone it, hit F5, and you will have your Angular 2 app with Web API.

At the time of writing this post I am using .NET Core version 1.0.0-preview1-002702. The complete seed project is available on GitHub.
Here is how I did it. I have installed the new .NET Core RC2. Select ASP.NET Core Web Application (.NET Core) and name the project as you like.
Select the Web API project template.
After the project is created successfully, the first thing to do is create a Views folder. The folder structure is the same as it is for an MVC application. This is how the folder structure looks.
Because it is a Web API project, by default it will not render views, so we have to add some dependencies to the project.json file.
"Microsoft.AspNetCore.StaticFiles": "1.0.0-rc2-final",
"Microsoft.AspNetCore.Mvc.TagHelpers": "1.0.0-rc2-final",
"Microsoft.AspNetCore.Mvc.WebApiCompatShim": "1.0.0-rc2-final"
The project.json file is different from the one in older versions of .NET Core; you will notice the difference as soon as you open it.
Next we set the routes for the views we have in the Startup.cs file.
app.UseMvc(routes =>
{
    routes.MapRoute("default",
                    "{controller=Home}/{action=Index}/{id?}");

    routes.MapWebApiRoute("defaultApi",
                          "api/{controller}/{id?}");
});
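For the default route to actually return a page, the project also needs MVC registered and static files enabled, plus a conventional controller that returns the Index view. The following is a minimal sketch under those assumptions; the HomeController shown here is illustrative and not part of the original seed project.
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

// Startup.cs (sketch): register MVC and serve static files from wwwroot,
// in addition to the UseMvc route setup shown above.
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseStaticFiles();   // provided by Microsoft.AspNetCore.StaticFiles
        app.UseMvc(routes => { /* routes as shown above */ });
    }
}

// Controllers/HomeController.cs (sketch): serves Views/Home/Index.cshtml.
public class HomeController : Controller
{
    public IActionResult Index()
    {
        return View();
    }
}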
The routes are now set, and you can run the application and check whether you can see the view. Once that works, let's start adding support for Angular. Here is the list of files we need to add.
tsconfig.json file

Add the below code to the file and save it.
{
  "compilerOptions": {
    "target": "es5",
    "module": "system",
    "moduleResolution": "node",
    "sourceMap": true,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "removeComments": false,
    "noImplicitAny": false,
    "rootDir": "wwwroot",
    "outDir": "wwwroot",
    "listFiles": true,
    "noLib": false,
    "diagnostics": true
  },
  "exclude": [
    "node_modules"
  ]
}
typings.json file

Add a blank .json file and name it typings.json

Add the below lines to the typings.json file.
{
  "ambientDependencies": {
    "es6-shim": "registry:dt/es6-shim#0.31.2+20160317120654",
    "jasmine": "registry:dt/jasmine#2.2.0+20160412134438"
  }
}
As per the official Angular documentation, we will stick with npm to manage the client-side dependencies. Start by adding a new npm Configuration File. The content of the file is almost the same as in the documentation, but I made a few changes as per my requirements. Here is the complete configuration file.
{
  "name": "Angular2WebAPI-Seed",
  "version": "1.0.0",
  "scripts": {
    "postinstall": "typings install"
  },
  "license": "ISC",
  "dependencies": {
    "@angular/common": "2.0.0-rc.1",
    "@angular/compiler": "2.0.0-rc.1",
    "@angular/core": "2.0.0-rc.1",
    "@angular/http": "2.0.0-rc.1",
    "@angular/platform-browser": "2.0.0-rc.1",
    "@angular/platform-browser-dynamic": "2.0.0-rc.1",
    "@angular/router": "2.0.0-rc.1",
    "@angular/router-deprecated": "2.0.0-rc.1",
    "@angular/upgrade": "2.0.0-rc.1",

    "lodash": "4.12.0",
    "systemjs": "0.19.27",
    "es6-shim": "^0.35.0",
    "reflect-metadata": "^0.1.3",
    "rxjs": "5.0.0-beta.6",
    "zone.js": "^0.6.12",
    "angular2-in-memory-web-api": "0.0.7",
    "bootstrap": "^3.3.6"
  },
  "devDependencies": {
    "gulp": "3.9.1",
    "concurrently": "^2.0.0",
    "typescript": "^1.8.10",
    "typings": "^0.8.1"
  }
}
Notice the postinstall script in the npm configuration file. It installs the typings required by our Angular 2 application. Now we can start setting up the Angular stuff in the wwwroot folder. Create an app folder and a js folder inside it. Inside the app folder create a new .ts (TypeScript) file. Here is how the folder structure looks in the wwwroot folder.
You can see a system.config.js file, which is the same as the one in the Angular quickstart guide; I have just mapped the paths for the dependencies so that they can be loaded without any problem. The main.ts file is the main bootstrapper, and app.component.ts is the component that will render the content on the page. At this point running the application will fail and give you several warnings and errors. To resolve that we need to add the references, and we can do this easily using a gulp file. The gulp file automates copying the dependencies into the wwwroot folder and eases our task.
The content of the gulp file in this case looks like this.
/// <binding BeforeBuild='default' />

var _ = require('lodash');
var gulp = require('gulp');

var js = [
    'node_modules/zone.js/dist/zone.min.js',
    'node_modules/systemjs/dist/system.js',
    'node_modules/reflect-metadata/Reflect.js',
    'node_modules/es6-shim/es6-shim.min.js'
];

var map = [
    'node_modules/es6-shim/es6-shim.map',
    'node_modules/reflect-metadata/reflect.js.map',
    'node_modules/systemjs/dist/system.js.map'
];

var folders = [
    'node_modules/@angular/**/*.*',
    'node_modules/rxjs/**/*.*'
];

gulp.task('copy-js', function () {
    _.forEach(js, function (file) {
        gulp.src(file)
            .pipe(gulp.dest('./wwwroot/js'));
    });
});

gulp.task('copy-map', function () {
    _.forEach(map, function (file) {
        gulp.src(file)
            .pipe(gulp.dest('./wwwroot/js'));
    });
});

gulp.task('copy-folders', function () {
    _.forEach(folders, function (folder) {
        gulp.src(folder, { base: 'node_modules' })
            .pipe(gulp.dest('./wwwroot/js'));
    });
});

gulp.task('default', ['copy-js', 'copy-map', 'copy-folders']);
Notice the very first line in the file above. We want to execute the task every time before the build is triggered. The gulp file only copies the required files to the js folder; other unnecessary files are left out. After the task is executed, here is how the final directory structure looks under wwwroot.
Depending on the selector you have in your component, you need to add that selector to your view. In my case the selector is app, hence I add <app></app> in the Index.cshtml file.

After you set up the index page, there is one more thing you have to do: set the launch URL.

The URL may differ depending on your API endpoint. Press F5 to run the application.


Parsing Markdown Using Custom TagHelper In ASP.NET MVC 6

30. November 2015 21:01

.NET Core ASP.NET MVC 

Previous versions of MVC allow us to write HtmlHelpers, which did a pretty good job then and still do now. But in MVC 6, the ASP.NET team has introduced TagHelpers.

Parsing Markdown in .NET is simpler than one can imagine, thanks to Stack Overflow's MarkdownSharp and Karlis Gangis's CommonMark.NET. I use CommonMark.NET as it provides much faster parsing than other libraries. The blogging platform I use is a custom blogging engine I wrote in MVC 4. The post content is saved as HTML, which makes my raw HTML messy when compared to simple markdown syntax. I have no plans to change the way it is right now, but for other simple applications, such as note-taking or blogging apps, I would like to save the content in markdown.

I will start with a simple implementation of this custom TagHelper and then we can look into ways to enhance it. Here is how easy it is to create your own custom TagHelper.

Create a new class file MarkdownTagHelper.cs. Inside the file, rename the class to match the file name, or choose any name you like. In my case I am keeping the class name the same as the file name.

Pay attention to the name of the custom TagHelper. By convention, the word TagHelper is dropped from the class name and the rest of the name becomes the tag name of your custom TagHelper.

The next step is to derive our class from the TagHelper class. Every custom TagHelper inherits from this class, just like the UserControl class when creating a custom user control. TagHelper provides two virtual methods, Process and ProcessAsync, which we will override to implement the custom logic for our markdown TagHelper. The first parameter is the TagHelperContext, which holds information about the current tag, and the second parameter is the TagHelperOutput object, which represents the output being generated by the TagHelper. As we need to parse markdown in our Razor view pages, we need to add a reference to the CommonMark.NET library. Use the below NuGet command to add it to your current project.

Install-Package CommonMark.Net

Up to this point, this is how the code looks.
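Roughly, at this stage the class just derives from TagHelper, overrides ProcessAsync, and converts the element's child content from markdown to HTML. Here is a minimal sketch consistent with the complete listing later in this post:
using System.Threading.Tasks;
using Microsoft.AspNet.Razor.Runtime.TagHelpers;

namespace WebApplication1.TagHelpers
{
    public class MarkdownTagHelper : TagHelper
    {
        public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
        {
            //Read the markdown written inside the <markdown> element.
            var content = await output.GetChildContentAsync();

            //Convert it to HTML with CommonMark.NET and render it inside the tag.
            var html = CommonMark.CommonMarkConverter.Convert(content.GetContent());
            output.Content.SetHtmlContent(html);
        }
    }
}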

So now we have our custom TagHelper that lets us parse markdown. But to use it in our views we need to opt in to this TagHelper in the _ViewImports.cshtml file. To enable your custom TagHelper, add a line like this:

@addTagHelper "*, WebApplication1"

Your custom tag helper should now turn purple on the view page. The line is similar to the one above it, where @addTagHelper imports all the TagHelpers from the given assembly. If you are not interested in opting in to all the TagHelpers in the given assembly, make use of @removeTagHelper to disable the TagHelpers you don't need. Here I want all the tag helpers I have created to be part of the application, hence the * symbol.

In your view, where you want to use this, just type <markdown>, and inside this tag you should have your markdown. To test it, you can view any raw file on GitHub and copy the text. I am using the README.md from CommonMark.NET and it rendered perfectly.

Caution: When copy-pasting markdown code from anywhere into your view, make sure you do not have whitespace at the front of the line. This is only applicable when you are working with in-line markdown. Here is a screenshot with a comparison.

Hit F5 and see the markdown tag helper in action. Below is the output I get.

This is the simplest case of all. Now let's add a prefix to our custom TagHelper. To add a custom tag prefix, we just need to visit the _ViewImports.cshtml file again and add a new line like so:

@tagHelperPrefix "blog-"

After adding the above line to the file, go to the view page where you used your custom TagHelper and you will notice that the <markdown> tag isn't purple anymore. This is because we now have to use the custom tag prefix we just defined in the _ViewImports.cshtml file. Change it from <markdown> to <blog-markdown> and it is purple again.

By default, the TagHelper will process the <markdown> tag. But that can easily be overridden by using the HtmlTargetElement attribute at the top of the class, allowing the use of another name instead of <markdown>. This does not mean that you cannot use <markdown>; rather, you can also use the custom TagHelper with the name specified in this attribute.

Now let's add some attributes to the markdown TagHelper. Let's try to add a url attribute which lets the user render markdown on the view from a remote site like GitHub. To add an attribute, simply add a new public property of type string and call it Url. When you create a public property in the TagHelper class, it is automatically treated as an attribute. To make use of this property, my view now simply says this:

<blog-markdown url="https://raw.githubusercontent.com/Knagis/CommonMark.NET/master/README.md">
</blog-markdown>

The url attribute value is read by the TagHelper, which in turn reads the whole string of markdown from GitHub and renders the HTML on the page. Let's focus again on the HtmlTargetElement attribute for a while. Consider a scenario where you don't want your custom TagHelper to render or work if certain attributes are not passed or are missing. This is where the HtmlTargetElement attribute comes into the picture. If I don't want my TagHelper to work when the url attribute is missing, I can simply write the HtmlTargetElement attribute like so:

[HtmlTargetElement("markdown", Attributes = "url")]

Notice the Attributes parameter. It lets you specify the attributes that must be present for your TagHelper to be processed; otherwise the TagHelper will not run. For instance, if I just use the <markdown> TagHelper but do not pass the url attribute, the TagHelper will not execute and you will see the raw markdown code. My requirement is to have this TagHelper work with or without the url attribute, so I can comment out or remove the HtmlTargetElement attribute, or just remove the Attributes parameter, to get going.

Here is the complete MarkdownTagHelper.cs:

using Microsoft.AspNet.Razor.Runtime.TagHelpers;
using System;
using System.Net.Http;
using System.Threading.Tasks;

namespace WebApplication1.TagHelpers
{
    //[HtmlTargetElement("markdown", Attributes = "url")]
    public class MarkdownTagHelper : TagHelper
    {
        //Attribute for our custom markdown
        public string Url { get; set; }

        private string parse_content = string.Empty;

        //Stolen from: http://stackoverflow.com/questions/7578857/how-to-check-whether-a-string-is-a-valid-http-url
        private bool isValidURL(string URL)
        {
            Uri uriResult;
            return Uri.TryCreate(URL, UriKind.Absolute, out uriResult)
                && (uriResult.Scheme.ToLowerInvariant() == "http" || uriResult.Scheme.ToLowerInvariant() == "https");
        }

        public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
        {
            if (context.AllAttributes["url"] != null)
            {
                string url = context.AllAttributes["url"].Value.ToString();
                string webContent = string.Empty;
                if (url.Trim().Length > 0)
                {
                    if (isValidURL(url))
                    {
                        using (HttpClient client = new HttpClient())
                        {
                            webContent = await client.GetStringAsync(new Uri(url));
                            parse_content = CommonMark.CommonMarkConverter.Convert(webContent);
                            output.Content.SetHtmlContent(parse_content);
                        }
                    }
                }
            }
            else
            {
                //Gets the content inside the markdown element
                var content = await output.GetChildContentAsync();

                //Read the content as a string and parse it.
                parse_content = CommonMark.CommonMarkConverter.Convert(content.GetContent());

                //Render the parsed markdown inside the tags.
                output.Content.SetHtmlContent(parse_content);
            }
        }
    }
}

I find the TagHelper feature in MVC 6 a lot more convenient and powerful than the old HtmlHelpers. I hope you like it as well.


.NET Projects You Should Be Following On Github

19. July 2015 04:51

API ASP.NET ASP.NET MVC C# Musings Web 

Open source has entirely changed the programming and developer world. Today you can create any application, game, or mobile app without spending a single penny, thanks to open-source software and the awesome community of developers and people behind it. As a .NET developer I have been developing enterprise applications for quite a long time, and now I have shifted my focus towards developing products and understanding what it takes to make a successful product launch.

Back then, I used to spend most of my time investigating new technologies and deciding which technology we should use to get the job done. I still do that today, not because a project requires it but because I keep asking a lot of questions. The projects I have compiled below have helped me learn lots of new things and gain insights into programming, and I hope they do the same for you. Here is the list of awesome open-source projects that you should be following on GitHub.

Pinta


We all know about Paint.NET; it is an awesome tool and a complete replacement for Photoshop (at least for me). Pinta is another project which is much the same, but open-source, and it works on Linux and Mac. It uses Gtk# (GTK sharp) to run on both Windows and Linux platforms. This project is a must if you are a .NET guy and want to get yourself into some serious programming; you will learn about the ins and outs of using Gtk# in your projects. Though Microsoft has already taken steps to bring the .NET Framework to Linux, this project is still a great learning source.

Official site: http://www.pinta-project.com/

Github: https://github.com/PintaProject/Pinta

 

ShareX


I take a lot of screenshots and record screencasts for my personal use, but I used to use two different tools to get the work done. ShareX will not just take screenshots or let you record screencasts easily; it will also allow you to upload them to over 40 different file-hosting cloud services. Dive into the source code and see the awesomeness under the hood. Here is the project description as seen on GitHub.

ShareX is an open source program that lets you take screenshots or screencasts of any selected area with a single key, save them in your clipboard, hard disk or instantly upload them to over 40 different file hosting services. In addition to taking screenshots, it can upload images, text files and all other different file types.

Official site: https://getsharex.com/

Github: https://github.com/ShareX/ShareX

 

StackExchange - Data Explorer


You have a programming question, you Google it, and it takes you to Stack Overflow. Stack Overflow needs no introduction among programmers; it is a Q&A site dedicated to helping developers get answers to their problems. But it is just one site. In recent years, Stack Exchange has grown and is not just providing support for programmers but also helping folks from other fields. The data Stack Exchange has is available to anyone for free under Creative Commons. If you are interested in looking at the source code that lets users query that immense data bank, head over to GitHub and fork this project. Stack Exchange is all about the Microsoft stack, and this tool is also written in ASP.NET MVC 3.

Official App: https://data.stackexchange.com/

Github: https://github.com/StackExchange/StackExchange.DataExplorer

 

Mini Blog


This is a minimalistic blog engine written in ASP.NET Web Pages by the author of BlogEngine.NET, Mads Kristensen. I started my blog with BlogEngine.NET and had an amazing experience with it. MiniBlog is totally different in terms of the features offered compared to BlogEngine.NET. This project shows you the power of Web Pages and how you can write your own simple site without wasting much time.

Demo: http://miniblog.azurewebsites.net/ (with user name and password as demo).

Github: https://github.com/madskristensen/MiniBlog

 

Fluent Scheduler

If you want to run cron jobs or automated jobs quietly within your application, this is the library you should be using. The documentation is pretty sleek and gets you started in no time. Beyond that, you should take a look at the source code and see how nicely this has been done.
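To give an idea of what using it looks like, here is a small, hypothetical sketch based on the library's Registry-based API; the job, the schedule and the exact type names (for example IJob/JobManager versus the older ITask/TaskManager) are assumptions and can differ between versions.
using FluentScheduler;

// Hypothetical job: the work you want to run on a schedule.
public class CleanupJob : IJob
{
    public void Execute()
    {
        // recurring work goes here
    }
}

// Registry describing when and how often the job runs.
public class JobRegistry : Registry
{
    public JobRegistry()
    {
        // Run once at startup, then every 30 minutes.
        Schedule<CleanupJob>().ToRunNow().AndEvery(30).Minutes();
    }
}

// At application startup (e.g. Application_Start):
// JobManager.Initialize(new JobRegistry());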

Github: https://github.com/jgeurts/FluentScheduler

 

Dapper

A micro-ORM used by the Stack Exchange sites. For many scenarios it is a lightweight replacement for Entity Framework, and it is just a single file that you can drop into your project to get started.

Dapper is a single file you can drop in to your project that will extend your IDbConnection interface.
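To illustrate, querying with Dapper is just an extension method on an open IDbConnection; the connection string, table and POCO below are hypothetical.
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public class Post
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public static class DapperExample
{
    public static void Run()
    {
        // Hypothetical connection string and Posts table.
        using (var connection = new SqlConnection(@"Server=.;Database=Blog;Integrated Security=true;"))
        {
            // Query<T> is the extension method Dapper adds to IDbConnection;
            // each row is mapped straight onto the Post POCO.
            var posts = connection.Query<Post>(
                "SELECT Id, Title FROM Posts WHERE IsPublished = @published",
                new { published = true }).ToList();
        }
    }
}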

Github: https://github.com/StackExchange/dapper-dot-net

 

LINQ-to-Wiki

A .NET library to access the MediaWiki API. The library is almost 3 years old, but the source code is well worth a look. Excerpt from GitHub:

LinqToWiki is a library for accessing sites running MediaWiki (including Wikipedia) through the MediaWiki API from .Net languages like C# and VB.NET.

It can be used to do almost anything that can be done from the web interface and more, including things like editing articles, listing articles in categories, listing all kinds of links on a page and much more. Querying the various lists available can be done using LINQ queries, which then get translated into efficient API requests.

Github: https://github.com/svick/LINQ-to-Wiki


Getting Started With ASP.NET 5 On Ubuntu

16. June 2015 22:59

.NET Framework ASP.NET ASP.NET MVC C# Microsoft Ubuntu Visual Studio Web 

Ever since the .NET stack went open source last year, there has been huge excitement among developers about .NET and about building .NET apps that are no longer limited to the Windows platform. I tried to install ASP.NET vNext on an Ubuntu VM and failed terribly on the first go. Why? Because the tutorial I used was quite old and I messed up the installation of the prerequisites. But I got everything working on the second try. So here are the steps and commands that will get you started with ASP.NET vNext on Ubuntu.

I am setting up a fresh VM for development on Ubuntu 14.04.2 LTS

Installing Mono

The first thing is to install Mono. For folks who are new to the Linux environment, Mono is a community-driven project that allows developers to build and run .NET applications on Linux platforms. Here is the set of commands to execute to install Mono.

sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF

echo "deb http://download.mono-project.com/repo/debian wheezy main" | sudo tee /etc/apt/sources.list.d/mono-xamarin.list
sudo apt-get update

Install the latest version of Mono available.

sudo apt-get install mono-complete

To check whether Mono is successfully installed, or to determine the version of Mono on your machine, run the below command in the terminal.

mono --version

Installing LibUV

As stated on Github:

Libuv is a multi-platform asynchronous IO library that is used by the KestrelHttpServer that we will use to host our web applications.

Running the below command will install the dependencies required to build libuv.

sudo apt-get install automake libtool

Then get the source, build it, and install it:

curl -sSL https://github.com/libuv/libuv/archive/v1.9.0.tar.gz | sudo tar zxfv - -C /usr/local/src
cd /usr/local/src/libuv-1.9.0
sudo sh autogen.sh
sudo ./configure
sudo make 
sudo make install
sudo rm -rf /usr/local/src/libuv-1.9.0 && cd ~/
sudo ldconfig

Here is a note from the GitHub repo that explains what the above set of commands is doing.

NOTE: make install puts libuv.so.1 in /usr/local/lib, in the above commands ldconfig is used to update ld.so.cache so that dlopen (see man dlopen) can load it. If you are getting libuv some other way or not running make install then you need to ensure that dlopen is capable of loading libuv.so.1

Getting .NET Version Manager (DNVM)

DNVM is a command-line tool which allows you to get new builds of the DNX (.NET Execution Environment) and to switch between them. To get DNVM running, fire the below command in the terminal.

curl -sSL https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.sh | DNX_BRANCH=dev sh && source ~/.dnx/dnvm/dnvm.sh

To check whether DNVM is successfully installed on your machine, type dnvm in the terminal. The output should be something like this:

At any point, if you want to list the installed DNX runtimes, run the below command:

dnvm list

The next step is to upgrade DNVM so you can use the dnx and dnu commands. Run the following command in the terminal:

dnvm upgrade

Once this is done, we are all set to run an ASP.NET vNext application on the Ubuntu box. Clone the aspnet/Home repository from GitHub. If you don't have Git installed, install it with this simple command.

sudo apt-get install git

For simplicity, I have created a new directory on the Ubuntu desktop named vnext. You can name the directory as you wish. Navigate to this directory in the terminal and clone the aspnet/Home repository.

git clone https://github.com/aspnet/Home.git

After the repository is cloned, navigate to the 1.0.0-beta4 directory.

You can see three sample applications that you can test. For this tutorial I am going to check out the HelloMvc application. Get inside the HelloMvc directory and then run the command:

dnu restore

This will take some time to execute. When you run this command, the project.json.lock file gets created and the package restore starts. I didn't face this problem, but there is a chance someone will: at the end, when the restore is finalizing, it may say permission is denied. To resolve this error you can change the permissions of the folder by running the following command.

sudo chmod -R 755 HelloMvc

You should always set permissions to 755 for directories and 644 for files.

After the restore completes, you can start the server by running the command:

dnx . kestrel

This command works for both the web and MVC applications. If you plan to test out the console application, you can run the following command:

dnx . run

The server runs at port 5004. Fire up the browser and type in http://localhost:5004/

I hope this is helpful for first-time users of Linux.


Recurring Tasks Inside ASP.NET Applications Using HangFire

15. June 2014 22:46

ASP.NET ASP.NET MVC C# SQL Server Web 

This is open source at its best. Running background tasks in the context of ASP.NET was, and still is, a big deal for quite a few developers. I use QueueUserWorkItem to schedule emails when a new comment is added on my blog. This makes sure that the UI stays responsive and the user can close the page or navigate to another post. I have been working on enterprise applications for many years now, and most long-running tasks run in the background, i.e. in Windows services.

HangFire is not limited to ASP.NET applications; you can even use it in your console applications.

HangFire is an open-source project which allows us to run recurring tasks within an ASP.NET application. No need for scheduled tasks or Windows services; everything stays within the ASP.NET application. When a new comment is added on my blog, an email is sent to my inbox as a notification to moderate it. Normally, adding a comment would take longer than it should because an email is also being sent to my inbox. To overcome this problem, I queued the mail process in the background like so:

bool commentSave = _db.AddComment(comment);
if (commentSave)
{
    System.Threading.ThreadPool.QueueUserWorkItem(s=> BlogEmail.SendEmail(comment));
    return Json(new { message = "Thanks for your comment. The comment is now awaiting moderation" });
}
else
    return Json(new { message = "There is an error while saving comment. Please try again later" });

As soon as a comment is added, the user is told that the comment has been saved to the DB, while the process of sending the email is scheduled in the background. But this approach has a drawback. What if the email sending fails? As the admin of my blog, will I be able to see the status of the process? HangFire answers all these questions, and it comes with an awesome monitor which displays the status of all the background tasks in real time. I will discuss the HangFire monitor later in this post, but first let's get started with HangFire.

Installing HangFire

HangFire is available on NuGet. Running the below command automatically adds the references to your project and takes care of all the configuration.

Install-Package HangFire
Attempting to resolve dependency 'HangFire.SqlServer (= 0.9.1)'.
Attempting to resolve dependency 'HangFire.Core (= 0.9.1)'.
Attempting to resolve dependency 'Common.Logging (= 2.1.2)'.
Attempting to resolve dependency 'Newtonsoft.Json (= 5.0.0)'.
Attempting to resolve dependency 'ncrontab (= 1.0.0)'.
Attempting to resolve dependency 'Dapper (= 1.13)'.
Attempting to resolve dependency 'HangFire.Web (= 0.9.1)'.
Attempting to resolve dependency 'CronExpressionDescriptor (= 1.10.1)'.
Attempting to resolve dependency 'WebActivatorEx (= 2.0.1)'.
Attempting to resolve dependency 'Microsoft.Web.Infrastructure (= 1.0.0.0)'.
Installing 'Common.Logging 2.1.2'.
Successfully installed 'Common.Logging 2.1.2'.
Installing 'ncrontab 1.0.0'.
Successfully installed 'ncrontab 1.0.0'.
Installing 'HangFire.Core 0.9.1'.
Successfully installed 'HangFire.Core 0.9.1'.
Installing 'Dapper 1.13'.
Successfully installed 'Dapper 1.13'.
Installing 'HangFire.SqlServer 0.9.1'.
Successfully installed 'HangFire.SqlServer 0.9.1'.
Installing 'CronExpressionDescriptor 1.10.1'.
Successfully installed 'CronExpressionDescriptor 1.10.1'.
Installing 'WebActivatorEx 2.0.1'.
Successfully installed 'WebActivatorEx 2.0.1'.
Installing 'HangFire.Web 0.9.1'.
Successfully installed 'HangFire.Web 0.9.1'.
Installing 'HangFire 0.9.1'.
Successfully installed 'HangFire 0.9.1'.
Adding 'Common.Logging 2.1.2' to HangfireDemo.
Successfully added 'Common.Logging 2.1.2' to HangfireDemo.
Adding 'ncrontab 1.0.0' to HangfireDemo.
Successfully added 'ncrontab 1.0.0' to HangfireDemo.
Adding 'HangFire.Core 0.9.1' to HangfireDemo.
Successfully added 'HangFire.Core 0.9.1' to HangfireDemo.
Adding 'Dapper 1.13' to HangfireDemo.
Successfully added 'Dapper 1.13' to HangfireDemo.
Adding 'HangFire.SqlServer 0.9.1' to HangfireDemo.
Successfully added 'HangFire.SqlServer 0.9.1' to HangfireDemo.
Adding 'CronExpressionDescriptor 1.10.1' to HangfireDemo.
Successfully added 'CronExpressionDescriptor 1.10.1' to HangfireDemo.
Adding 'WebActivatorEx 2.0.1' to HangfireDemo.
Successfully added 'WebActivatorEx 2.0.1' to HangfireDemo.
Adding 'HangFire.Web 0.9.1' to HangfireDemo.
Successfully added 'HangFire.Web 0.9.1' to HangfireDemo.
Adding 'HangFire 0.9.1' to HangfireDemo.
Successfully added 'HangFire 0.9.1' to HangfireDemo.

I am using HangFire with an ASP.NET MVC application. Here are a few things you need to configure before you dive in. When you install HangFire via NuGet, it adds HangFireConfig.cs under the App_Start folder. HangFire supports Redis, SQL Server, SQL Azure and MSMQ; I am using SQL Server in this demo. We require this storage because it is used by the HangFire monitor to display the real-time data of the jobs. To configure HangFire to use SQL Server, open the HangFireConfig.cs file and change the connection string as per your SQL Server installation.

JobStorage.Current = new SqlServerStorage(
    @"Server=GHOST\SERVER; Database=Jobs;user id=sa; password=pass#w0rd1;");

When the application first starts, all required database objects are created. 

You can find the script inside the downloaded package at HangFire.SqlServer.0.9.1\Tools\install.sql. The jobs and the monitor use this database to show the real-time status of the jobs running in the background. To view the HangFire monitor, simply navigate to http://<sitename>/hangfire.axd. As it is a handler, you can see it registered in your web.config file. Let's see it in action:

The navigation pane on the right lets you see the jobs and their status. It even lets you see the queues which are currently running.

Scheduling the Jobs

Scheduling jobs using HangFire is easier than I thought it would be. Take the same example from my blog, which sends an email to my inbox when a new comment is added. If I want to schedule the mail-sending process as a background job, I can do it easily using the BackgroundJob class.

bool commentSave = _db.AddComment(comment);
if (commentSave)
{
    BackgroundJob.Enqueue(() => BlogEmail.SendEmail(comment));
    return Json(new { message = "Thanks for your comment. The comment is now awaiting moderation" });
}
else
    return Json(new { message = "There is an error while saving comment. Please try again later" });

As I require it to run only once, I just have to queue it using the BackgroundJob.Enqueue() method. I can also delay the execution of the job using the Schedule method of the BackgroundJob class.

bool commentSave = _db.AddComment(comment);
if (commentSave)
{
    BackgroundJob.Schedule(() => BlogEmail.SendEmail(comment), TimeSpan.FromMinutes(60));
    return Json(new { message = "Thanks for your comment. The comment is now awaiting moderation" });
}
else
    return Json(new { message = "There is an error while saving comment. Please try again later" });

What if the email sending fails? The SendEmail method throws an exception saying that the mail sending failed. HangFire handles this by default: it automatically retries 3 more times, with a delay between each retry. But if I want to retry more than 3 times, I can make use of the AutomaticRetry attribute and pass the number of retries I want, something like this:

[AutomaticRetry(Attempts = 5)]
public bool SendEmail(Comment comment)
{
    //Email code
}

Let's say I have another method that I want to run every minute (it's overkill for my blog); then I will make use of the RecurringJob class.

RecurringJob.AddOrUpdate(() => Storage.PunchIt(), Cron.Minutely);
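A couple of hedged variations on the same call; the "daily-digest" identifier and the BlogEmail.SendDigest method are made up for illustration, and the exact Cron overloads may differ by HangFire version.
// Run every day at 09:30 using the Cron helper.
RecurringJob.AddOrUpdate("daily-digest", () => BlogEmail.SendDigest(), Cron.Daily(9, 30));

// A raw cron expression works as well: every 15 minutes.
RecurringJob.AddOrUpdate("punch-often", () => Storage.PunchIt(), "*/15 * * * *");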

The Cron class allows me to schedule a job daily, weekly, monthly, yearly, hourly or minutely. Now, as my job is scheduled in the background, it's time to take a look at the HangFire monitor.

I have no idea why my Recurring Jobs screen shows the Next and Last execution times as 44 years ago, but you can see the Succeeded Jobs with a one-minute interval (#5 and #4). HangFire uses persistent storage, so you can trigger a job at will or remove it whenever you feel like it. That means you configure the job in code and manage it from the HangFire monitor.

What else you can do with HangFire

I just showed you how easy scheduling jobs with HangFire can be. But there are more advanced topics which you should look into for more complex implementations. HangFire supports logging, dependency injection using Ninject, multiple queue processing and more.
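For example, multiple queue processing works by tagging a job method with a queue name. Here is a hedged sketch reusing the SendEmail method from earlier; the "emails" queue name is arbitrary, and you should check that the Queue attribute is available in the HangFire version you are using.
// Route this job to the "emails" queue; a HangFire server configured to
// listen on that queue will pick it up.
[Queue("emails")]
public bool SendEmail(Comment comment)
{
    //Email code, as in the AutomaticRetry example above.
    return true;
}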

