Monday, March 16, 2015

CDN fails, but your scripts don’t have to – fallback to local jquery

Bundling with Asp.Net MVC


If you are working on Asp.Net, either Web Forms or MVC, you have probably come across bundling & minification of JavaScript & CSS files.

Background:-

There are hardly any applications which don’t use JavaScript or CSS files, and as our application grows, we tend to split these files logically into multiple files. We then include each file individually on the HTML page which is sent to the browser. The browser then resolves the dependencies by making a separate request to the server for each file, which can slow down the loading of your page.

In .Net 4.5, a powerful feature was introduced which allows you to bundle multiple files into one file, which is then downloaded by the browser in one single request to the server. Below is how it works:-

A typical example of defining a bundle in BundleConfig.cs is given below:-

  bundles.Add(new ScriptBundle("~/bundles/jqueryval").Include(
                        "~/Scripts/jquery.unobtrusive*",
                        "~/Scripts/jquery.validate*",
                        "~/Scripts/requiredif-validation.js"));


The code above just creates a new instance of ScriptBundle and assigns it the virtual path "~/bundles/jqueryval". It then invokes the Include method to specify which files should be included in the bundle; the trailing * is a wildcard which matches every file starting with that prefix.
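For the bundle to actually get registered, RegisterBundles must be wired up at application start-up. A minimal sketch is below; the class and file names follow the default MVC template, so adjust them to your project:

```csharp
// BundleConfig.cs -- by convention located in the App_Start folder
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        bundles.Add(new ScriptBundle("~/bundles/jqueryval").Include(
            "~/Scripts/jquery.unobtrusive*",
            "~/Scripts/jquery.validate*",
            "~/Scripts/requiredif-validation.js"));
    }
}

// Global.asax.cs -- register the bundles once, when the application starts
public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        BundleConfig.RegisterBundles(BundleTable.Bundles);
    }
}
```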

Now that the bundle is registered, we can reference it in the views using the statement below:-

@Scripts.Render("~/bundles/jqueryval")

Now, before you run this, you should know that by default bundling is switched off in debug mode, so that developers can debug the individual files if they want to. You can override this behavior from web.config by changing the value of debug to false as shown below:-

<compilation debug="false" targetFramework="4.5" />

Or override these settings by adding the line below to BundleConfig.cs:-

BundleTable.EnableOptimizations = true;

Once you do that, the next time you run the application you should see something like the following included on your page in the browser:-

<script src="https://yoursitename.com/bundles/jqueryval?v=JzhfglzUfmVF2qo-weTo-kvXJ9AJvIRBLmu11PgpbVY1"></script>

Using CDN with bundling:-

In the above example, we created a bundle which loads all files from our server, but what if you wish to load files from a CDN? Files like jQuery are so commonly used that they are available on almost all CDNs, and loading these files from a CDN can noticeably improve the load time of your web page.

Let’s see how it can be implemented.

First, the declaration of bundle for a CDN changes as shown below:-

var jqueryBundle = new ScriptBundle("~/bundles/jquery", @"https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js").Include(
                "~/Scripts/jquery-{version}.js");
bundles.Add(jqueryBundle);

Above we did two things: first we specified the CDN location of the file, and then the local version of the same file (the {version} token matches whatever version of jquery-*.js is present in the Scripts folder). In debug mode the local version will be used, and in release mode the CDN version will be picked.

The second step is to enable CDN usage in the RegisterBundles method by adding the line below:-

bundles.UseCdn = true;

Once this is done, the CDN version will be used in release mode and the local version in debug mode, which, as I stated earlier, you can override.
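Putting the pieces together, a RegisterBundles method for a CDN-backed jQuery bundle looks something like this (the CDN URL here is Google's hosted jQuery 1.10.2; substitute your preferred CDN):

```csharp
using System.Web.Optimization;

public static void RegisterBundles(BundleCollection bundles)
{
    // Tell the bundling system it may serve bundles from a CDN in release mode
    bundles.UseCdn = true;

    var jqueryBundle = new ScriptBundle(
        "~/bundles/jquery",
        @"https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js")
        .Include("~/Scripts/jquery-{version}.js");

    bundles.Add(jqueryBundle);
}
```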

Now this is all good, but whenever you use third-party services in your application you always have to plan for failure; in this case, let’s say either the CDN is down or the file is no longer hosted there. You can't let this break your web page, and hence you need a fallback option. Luckily, this is available with bundling in Asp.Net MVC.

Bundling with fallback:-
With a few extra additions to the CDN bundling code above, you can enable a fallback which picks up these files from your own server in case of CDN failure.

First, add the CdnFallbackExpression in the bundle as mentioned below:-

  var jqueryBundle = new ScriptBundle("~/bundles/jquery", @"https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js").Include(
                "~/Scripts/jquery-{version}.js");
jqueryBundle.CdnFallbackExpression = "window.jQuery";
bundles.Add(jqueryBundle);

Here you just define a JavaScript expression which is evaluated after the script tag for the CDN file; note that it is case sensitive, so it must be window.jQuery, not window.jquery. If the expression evaluates to a falsy value, the framework emits a small inline script that loads the file from your own server, similar to the one shown below:-

<script>(window.jQuery)||document.write('<script src="/bundles/jquery"><\/script>');</script>

The basic idea for CDN fallback is to check for a type or variable that should be present after a script load, and if it's not there, try getting that script locally.
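The same pattern works for any library that defines a detectable global. For example, the default MVC template ships with Modernizr, whose presence can be tested via window.Modernizr. The CDN URL below is illustrative only; point it at whichever CDN actually hosts your Modernizr build:

```csharp
// A second fallback example: Modernizr exposes the global window.Modernizr
// once it has loaded, so that is the expression to test.
var modernizrBundle = new ScriptBundle(
    "~/bundles/modernizr",
    @"https://cdn.example.com/modernizr/2.6.2/modernizr.min.js")
    .Include("~/Scripts/modernizr-*");

modernizrBundle.CdnFallbackExpression = "window.Modernizr";
bundles.Add(modernizrBundle);
```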

Once you do this, you are ready to use bundling with a CDN and a working fallback, and you can be confident that your web page will get the files it needs from one source or another.

Sunday, March 15, 2015

Microsoft Azure vs Amazon Web Services

If you are working with the cloud then you must have heard these two names for sure: Microsoft Azure and Amazon Web Services.

If you are evaluating Microsoft Azure vs Amazon Web Services, a great place to start is last year's Gartner Magic Quadrant for cloud infrastructure. It provides a great background on the factors you should consider while deciding on a cloud vendor, and on who else is available in the market apart from these two.

Although it's a great article and gives you some real insight into the cloud world, it fails to explain some great features of Microsoft Azure, like its integration capabilities; if you require those, Azure is the only option for you. Also, keep in mind that this article is fairly old and doesn't consider the newly available security, networking and integration capabilities of Azure.

Azure continues to rapidly add new features, and it currently offers advanced networking and security functions, including resource-level Role Based Access Control, network intrusion detection/prevention systems (IDS/IPS), subnet- and interface-level network security settings, and multiple NICs per virtual machine.

But the key difference between Microsoft’s offering and Amazon’s is that Azure is built to allow customers to implement a seamless hybrid approach to the cloud in a way that AWS is not.

Below are some specific details on what this difference entails:

  • Azure’s Virtual Machine strategy uses native Hyper-V images for real-time dynamic portability from on-premises to the cloud and from the cloud back to on-premises, while AWS uses a proprietary hypervisor that does not allow images to be moved back to on-premises hardware in this way.
  • Azure offers hybrid storage solutions that allow you to extend your current storage solutions with cloud storage to increase capacity, providing offsite storage of live data as well as backups and archive data. With the Azure solution, this data is available both on-premises and in the Microsoft datacenters, thus offering a host of recovery options.
  • Azure’s development and management tools are the native Windows, SQL and developer tools such as Visual Studio, SQL Manager and System Center, allowing native integration of Azure resources with your existing on-premises environments in a single pane of glass.
  • Azure’s platform services are the same offerings that companies have been using on-premises: services such as BizTalk, SQL Server and IIS are all available on-premises, in the cloud as virtual machines, and in the cloud as platform services.
  • Azure’s Active Directory service provides easy extension of existing Active Directory environments into the cloud, offering federated login to corporate applications as well as thousands of third-party SaaS offerings. Azure also offers companies turn-key, self-service password reset for on-premises and cloud-based Active Directory through Azure Active Directory.
  • Microsoft has a very simple and clear policy for privacy and security and is working to limit the government’s ability to blind-subpoena corporate data. Microsoft’s Azure Trust Center provides detailed information about certifications, government access and other security topics.

Amazon got a bit of a head start in the cloud market, and that's why they have enjoyed an early lead for a while. But Microsoft is committed to investing in their cloud services and they are catching up pretty quickly. Remember, Amazon may have a lead, but the race has just begun :)


Sunday, March 8, 2015

API Health Monitoring using MEF & Microsoft Azure

Memphis - That's the name I have given to the custom tool I wrote to monitor the health & availability of our API. 

Background:
There are thousands of tools which can be used to monitor the performance, usage and availability of websites, but when it comes to APIs, the usual availability test, which just pings your server, doesn't suffice. The server hosting the API might be up and running while the methods under its endpoints are not, because backend services or databases are down. The only way to verify that those methods are working is to call them constantly and verify the responses. This is what API monitoring tools offer: you write tests for your endpoint methods and upload them to their websites. Although they are configurable, they don't give you exactly what you need, and if your APIs are as diverse as ours then it really becomes a problem. Also, there is very limited re-usability in your tests.

We have our APIs hosted on-premises and in the cloud, further wrapped under an API management tool (APIGEE). Also, no matter where they are hosted, all are secured via some form of authentication.

Memphis:
The name Memphis came from the underlying framework as the tool is based on the Microsoft's "Managed Extensibility Framework (MEF)" and hosted on Microsoft's cloud - Azure. Below are the components involved in this tool:-

[Architecture diagram of the Memphis components omitted]
It may look like a complicated architecture, but the components are really straightforward and simple to use. The idea I had while writing this thing was that writing and adding tests to the framework should be extremely simple; if it's not, people are not going to use it. And since people like copy-pasting a lot, adding tests to this framework should be as simple as drag and drop, which is why I used MEF. MEF allows you to discover things at runtime, like finding DLLs of a specific type at some location. Also, minimum code should be required to write tests, which is why the "Skeleton" was added. All the components are explained below:-

1. IPlugin: MEF can discover DLLs at runtime in a couple of ways; one way is to find all implementations of one specific interface, and that is what IPlugin is. You can write a test in a simple class library project that implements IPlugin, drop the DLL into Memphis, and it will be picked up for execution.

2. Skeleton: MEF and the IPlugin interface solve the problem of simplifying the addition of tests to Memphis, but what about the ease of writing them? This is where Skeleton, an abstract class implementing IPlugin, comes into the picture. Skeleton provides a default implementation of most of IPlugin's methods and allows the user to override them. It also has helpers for APIGEE and Microsoft Azure which ease the authentication issues. Skeleton also takes care of logging and error handling, so all that's left to do is write your code for calling the method and verifying the result.
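The post doesn't show the actual interface, so the IPlugin/Skeleton arrangement can only be sketched; every name and member below is a hypothetical illustration of the pattern, not Memphis's real API:

```csharp
using System;
using System.ComponentModel.Composition;

// Hypothetical contract that MEF discovers at runtime.
public interface IPlugin
{
    string Name { get; }
    void Execute();          // run the test and record the result
}

// Hypothetical base class: default plumbing, so a concrete test
// only has to implement the actual call-and-verify logic.
public abstract class Skeleton : IPlugin
{
    public abstract string Name { get; }

    public void Execute()
    {
        try
        {
            RunTest();       // logging & error handling wrap the real work
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine($"{Name} failed: {ex.Message}");
        }
    }

    protected abstract void RunTest();
}

// A concrete test drops into the Plugins folder as a DLL; the Export
// attribute is what makes it discoverable by MEF's catalogs.
[Export(typeof(IPlugin))]
public class SearchServiceTest : Skeleton
{
    public override string Name => "SearchServiceTest";

    protected override void RunTest()
    {
        // call the API method and verify the response here
    }
}
```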

3. Memphis Executor: The Memphis executor is a WCF REST service hosted in the cloud as an Azure Website, whose job is to execute the tests written by inheriting Skeleton. It executes all tests found in the DLLs available in the Plugins folder of the solution. Each test’s results are held in a separate table in Azure Storage. The executor runs periodically under the Azure Scheduler at a predefined interval of either 15 or 30 minutes.

4. Memphis Reporter: The Memphis reporter is an Azure WebJob running at an interval of every 5 minutes. The job of the reporter is to analyse the test results and report failures, drops in availability, and average responses outside the configured limits of each test. The configuration is defined later. Alerts are sent to the configured admin people and to any additional recipients defined at the test level.

5. Memphis Web: Memphis web is a web portal hosted in the cloud as an Azure website. The portal’s dashboard shows the tests configured in the Memphis executor, and a details page shows the performance of a test for the selected day, plus the configured and calculated availability and average response. The details page also shows the failed tests and the errors behind them.

Of the five components explained above, only the Memphis executor needs to be redeployed every time tests are changed or added. Memphis web and the reporter work independently and pick up everything newly added or changed in the executor. Everything in a Memphis test is configurable. Below is the configuration per test:-

<monitor name="sectionName" methodName="methodname" serviceName="serviceEndpointName" tableName="azureTableName" additionalRecipient="piyush.gupta@bupa.com" serviceUrl="https://mywcfserviceurl.com/SearchService.svc" disableLogging="false" minTimeBetweenAvailabilityAlerts="120" availabilitySLA="88" disableReporting="true" averageResponse="7" failDuration="35" failCount="2" minTimeBetweenResponseAlerts="120" />

MonitorSection is the name of the main config section, defined in the Memphis executor’s web.config, which it needs in order to run the tests properly. MonitorSection holds one “monitor” node for each test, and each monitor node defines the configuration properties of that test only. The properties of this section are described below:-

Name (String, required, default: null) – Name of the section; must be unique across all tests.

MethodName (String, required, default: null) – Name of the method being tested under the service.

ServiceName (String, required, default: null) – Name of the service whose method is being tested.

TableName (String, required, default: null) – Name of the table which will hold the test results.

AdditionalRecipients (String, optional, default: null) – Any additional email addresses that should receive notification of service failures along with the admin team.

ServiceUrl (String, required, default: null) – Url of the service being tested.

DisableLogging (Bool, optional, default: false) – Indicates whether logging of test results is disabled.

DisableReporting (Bool, optional, default: false) – Indicates whether reporting of test results via email is disabled.

AvailabilitySLA (Int, required, default: 99) – The expected availability of the service, used during analysis of the test results; an email notification is sent if the calculated availability goes below this value.

AverageResponse (Int, required, default: 0.5) – Expected average response in seconds, used during analysis of the test results; email notifications are sent if the calculated value goes below this.

FailDuration (Int, optional, default: 35) – Duration in minutes; if more failures than FailCount are encountered within it, failure alerts are sent.

FailCount (Int, optional, default: 2) – Minimum number of test failures within FailDuration before failure alerts are sent.

MinTimeBetweenAvailabilityAlerts (Int, optional, default: 30) – Gap in minutes between availability alerts; while availability stays below the expected value, alerts are repeated at the interval configured here.

MinTimeBetweenResponseAlerts (Int, optional, default: 30) – Gap in minutes between average response alerts; while average response stays below the expected value, alerts are repeated at the interval configured here.


All the above configuration properties are defined per test. Each test has its own set of this configuration, which is used by the Memphis executor and reporter for email alerts. The other config settings available in the executor and reporter are explained in their respective sections.

This is all about Memphis. It may not be the perfect tool, but it handles the basic requirement of availability checking pretty well.

Hope you enjoyed Memphis, and if you have any feedback then feel free to share it.