Portable Class Libraries are a great way to share common code between different platforms. But therein lies a danger: a Portable Class Library project cannot change how the underlying platform behaves.

Here’s a simple explanation of what that means.

Example project

The example application is built for Windows Forms, Windows RT and Windows Phone. The authentication mechanism is identical in every app, so it is placed in a Portable Class Library project:

    public class WebTool
    {
        public async Task<string> GetAuthenticationKey()
        {
            var cookieContainer = new CookieContainer();
            var client = new HttpClient(new HttpClientHandler() { CookieContainer = cookieContainer });

            // Authenticate with the service
            await client.GetStringAsync("http://cookietest.api.domain.com/cookie/get");

            // Return the received cookie as authentication key
            var cookieHeader = cookieContainer.GetCookieHeader(new Uri("http://cookietest.api.domain.com/"));

            return cookieHeader;
        }
    }

Using this library couldn’t be easier: each platform calls it with identical code:

            var webTool = new WebTool();

            var result = await webTool.GetAuthenticationKey();

            if (string.IsNullOrWhiteSpace(result))
                this.Result.Text = "Didn't receive authentication key!";
            else
                this.Result.Text = "Authentication key: " + result;

A few minutes and everything is done, so the testing phase begins.

First, Windows Forms:

[screenshot]

Then Windows 8:

[screenshot]

And last but not least, Windows Phone:

[screenshot]

Uh oh.

Cookies, and CookieContainer especially, are different beasts on different platforms. A Portable Class Library cannot hide that fact.
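One possible workaround (a sketch of my own, not from the original post) is to stop relying on CookieContainer altogether and read the Set-Cookie response header yourself. Setting UseCookies to false keeps the handler from consuming the header, so the raw value stays visible on every platform; the URL below is the same placeholder used above:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

public class WebToolManual
{
    public async Task<string> GetAuthenticationKeyManually()
    {
        // UseCookies = false: the handler leaves Set-Cookie alone,
        // so CookieContainer's platform differences never come into play
        var client = new HttpClient(new HttpClientHandler { UseCookies = false });

        var response = await client.GetAsync("http://cookietest.api.domain.com/cookie/get");

        IEnumerable<string> setCookieValues;
        if (response.Headers.TryGetValues("Set-Cookie", out setCookieValues))
        {
            // Keep only the "name=value" part of the first cookie
            return setCookieValues.First().Split(';')[0];
        }

        return null;
    }
}
```

The trade-off is that you now parse the header yourself, but that parsing behaves the same everywhere.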

Have you encountered similar issues when dealing with Portable Class Library projects?


As the PC makers seem to have some problems getting Haswell-based ultrabooks to market, I decided to get a MacBook Air 2013 (the MBA 13 i5 model) and use it as my main development machine with the help of Boot Camp. Unfortunately, I encountered a deal breaker: I couldn’t turn on Hyper-V in Windows 8:

Hyper-V cannot be installed: Virtualization support is disabled in the firmware

Hyper-V is required for Windows Phone 8 development, which I do quite a lot of.

But fortunately, there’s a fix for this, though it has a couple of problems:

  1. It costs money
  2. The solution is just bizarre

In order to enable Hyper-V you have to follow these steps:

  1. Boot the computer into Mac OS X
  2. Purchase (or get the trial), download and install Parallels Desktop for Mac
  3. Reboot back to Windows 8

And that’s it: Hyper-V is now available.

[screenshot]

If you turn off the computer at some point, Hyper-V gets disabled again until you boot into Mac OS X and run Parallels Desktop once more. So it’s better to just put the MacBook to sleep.

Bizarre, isn’t it?

I suppose this is just a bug in the firmware, as the 2012 MBA seems to support Hyper-V just fine. But until the bug is fixed, this works as a workaround. Thanks to TigerG for posting his findings at MacRumors.


After my TechDays presentation I was asked why we prefer Azure Virtual Machines for our web site hosting over Azure Web Roles. Here’s a slightly more structured answer to that question.

Our requirements

Let’s start by going through our requirements for the web site hosting platform. The main things we hope to have are:

  • Ease of deployment
  • Support for running ASP.NET and Node.js sites
  • Support for custom certificates

The problem with Azure Web Sites

Even though the original question didn’t touch on Azure Web Sites, I thought I’d cover that option too. Azure Web Sites is a great platform, but it has one big, glaring problem that makes it unsuitable for production use: the lack of support for custom certificates.

Other than that, Azure Web Sites offers really fast and easy deployment options, and Node.js is supported as well.

Almost there with Azure Web Roles

Azure Web Roles provide support for custom certificates and for Node.js, making the platform ready for production use. But it lacks in ease of deployment: by default, updating a site is slow, and it’s hard to run multiple applications in the same Web Role. The last bit is especially true if one wants to run both Node.js and ASP.NET apps on the same role.

The Azure Web Role platform pushes the developer in the “one app, one service” direction. This causes problems when your resources (money) are limited.

Flexibility with Azure Virtual Machines

If you want to combine Azure Web Sites’ ease of deployment with Azure Web Roles’ readiness for production use, Azure Virtual Machines offer a good solution. Even though it takes some time to set things up, there’s plenty of good documentation available. We’ve set up our web servers using the following tools:

  • DFSR to keep the web servers in sync
  • IIS configured to use “shared configuration” and “Centralized Certificate Store”
  • Web Deploy enabled on one of the servers (provides the option to “push” web sites from Visual Studio)
  • IISNode for running Node.js
  • FTP deploy for static sites and Node.js apps
  • Load balancing handled by Azure’s built-in load balancer

Summary

Here’s our requirements and the hosting options collected into a single table.

Feature / Platform     Azure Web Sites   Azure Web Roles   Azure Virtual Machines
Ease of Deployment     X                                   X
Node.js                X                 X                 X
Custom certificates                      X                 X

Other options:

Windows Azure Accelerator for Web Roles

Windows Azure Accelerator for Web Roles deals with the platform’s deployment problem. Unfortunately, the original project is now deprecated. Fortunately, robdmoore’s fork of the project is being actively developed.

The Azure Web Sites Private edition

The tools for running a private Azure Web Sites platform are available from the Web Platform Installer (Windows Azure / Service Management…). It seems that these tools support custom certificates. More info about this solution is available at Microsoft.com/hosting.


I’ve been processing a lot of messages through Azure Service Bus lately. During that processing I encountered a situation where the Azure Management Portal reported that there were over 97 thousand messages in the queue, but the workers still weren’t receiving any messages.

[screenshot]

It turns out the Management Portal’s count also includes messages in the Dead Letter Queue. Some of the messages had failed processing a few times, so they had been automatically moved to the Dead Letter Queue.

Here’s a code snippet for creating a QueueClient for the DLQ, from a previously created QueueClient:

var client = QueueClient.CreateFromConnectionString(conn, queue);
...

if (deadLetter)
{
    // The Dead Letter Queue has its own path, derived from the queue's path
    var dlqPath = QueueClient.FormatDeadLetterPath(client.Path);
    var dlqClient = QueueClient.CreateFromConnectionString(conn, dlqPath);
}
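To see what actually ended up in the Dead Letter Queue, the DLQ client can be used like any other QueueClient. A sketch (the variable names, timeout, and the decision to simply complete each message are my own assumptions, not from the original post):

```csharp
var dlqPath = QueueClient.FormatDeadLetterPath(client.Path);
var dlqClient = QueueClient.CreateFromConnectionString(conn, dlqPath);

BrokeredMessage message;
while ((message = dlqClient.Receive(TimeSpan.FromSeconds(5))) != null)
{
    // Service Bus stamps dead-lettered messages with the reason they were moved
    object reason;
    message.Properties.TryGetValue("DeadLetterReason", out reason);
    Console.WriteLine("Dead-lettered: {0}", reason);

    // Complete removes the message from the DLQ for good;
    // alternatively it could be re-sent to the original queue
    message.Complete();
}
```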


It's hard to schedule tasks on Azure: the platform doesn't have built-in support for executing custom code on a schedule. There are some options to get around this though, for example:

  • Azure VM and the Windows Task Scheduler
  • Azure Service Bus and ScheduledEnqueueTimeUTC

But these have some limitations which may be hard to get around: in the case of a VM, if it goes down, everything stops. With Azure Service Bus you still need some service that handles the scheduling.
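For reference, the Service Bus option from the list above looks roughly like this (a sketch; the queue name and payload are placeholders):

```csharp
var client = QueueClient.CreateFromConnectionString(conn, "tasks");

var message = new BrokeredMessage("run-report");

// The broker holds the message back until this time; a worker still has
// to be listening on the queue for anything to actually happen
message.ScheduledEnqueueTimeUtc = DateTime.UtcNow.AddMinutes(30);

client.Send(message);
```

This illustrates the limitation: the delivery is scheduled, but the scheduling logic itself still needs a home.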

This is where Quartz.net fits in. Quartz.net is a flexible and easy-to-use library for running scheduled tasks. And it works great on Azure.

Here are some highlights of Quartz.net's features:

  • Supports clusters
  • Can use SQL Azure
  • Cron-style syntax for scheduling tasks
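The cron-style syntax looks roughly like this (a sketch using Quartz.net's TriggerBuilder; the identity names are placeholders):

```csharp
// Quartz cron fields: sec min hour day-of-month month day-of-week
// "0 0 3 * * ?" = fire every day at 03:00
var trigger = TriggerBuilder.Create()
    .WithIdentity("nightlyTrigger", "maintenance")
    .WithCronSchedule("0 0 3 * * ?")
    .Build();
```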

In this tutorial we'll build a Quartz.net (version 2.0.1) cluster using SQL Azure and ASP.NET.

The database

First we need to set up the SQL Azure database. Quartz.net will persist all the job details and triggers to this database. The easiest way to create the required tables is to log in to the database with Management Studio and execute the Quartz.net database schema creation script.

And that's it for the database. The next step is to build the service that runs Quartz.net.

The ASP.NET project

The SQL Azure store is ready and now we need a service which will run actions based on the triggers. The service can be almost anything: a console app, a Windows Service or an ASP.NET project. In this tutorial we're going to use an ASP.NET project which is deployed to two extra small Cloud Service instances.

Here are the steps required for creating the service:

  1. Create new project with template "ASP.NET Empty Web Application"
  2. Use NuGet to install package "Quartz"
  3. Add New Item to project using template "Global Application Class" (global.asax)

Modify Global.asax.cs so that Quartz.net starts and stops with the application:

using System;
using Common.Logging;
using Quartz;
using Quartz.Impl;

namespace QuarzApp
{
    public class Global : System.Web.HttpApplication
    {
        public static IScheduler Scheduler;

        protected void Application_Start(object sender, EventArgs e)
        {
            ISchedulerFactory sf = new StdSchedulerFactory();
            Scheduler = sf.GetScheduler();

            Scheduler.Start();
        }

        protected void Application_End(object sender, EventArgs e)
        {
            Scheduler.Shutdown();
        }
    }

}

And finally modify the Web.config to include the Quartz.net configuration:

<?xml version="1.0"?>
<configuration>

  <configSections>
    <section name="quartz" type="System.Configuration.NameValueSectionHandler, System, Version=1.0.5000.0,Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    <sectionGroup name="common">
      <section name="logging" type="Common.Logging.ConfigurationSectionHandler, Common.Logging"/>
    </sectionGroup>
  </configSections>

  <system.web>
    <compilation debug="true" targetFramework="4.5" />
    <httpRuntime targetFramework="4.5" />
  </system.web>

  <common>
    <logging>
      <factoryAdapter type="Common.Logging.Simple.TraceLoggerFactoryAdapter, Common.Logging">
        <arg key="showLogName" value="true"/>
        <arg key="showDateTime" value="true"/>
        <arg key="level" value="INFO"/>
        <arg key="dateTimeFormat" value="HH:mm:ss:fff"/>
      </factoryAdapter>
    </logging>
  </common>

  <quartz>
    <add key="quartz.scheduler.instanceName" value="MyScheduler" />
    <add key="quartz.scheduler.instanceId" value="AUTO" />

    <add key="quartz.threadPool.type" value="Quartz.Simpl.SimpleThreadPool, Quartz" />
    <add key="quartz.threadPool.threadCount" value="5" />
    <add key="quartz.threadPool.threadPriority" value="Normal" />

    <add key="quartz.jobStore.useProperties" value="true" />
    <add key="quartz.jobStore.clustered" value="true" />
    <add key="quartz.jobStore.misfireThreshold" value="60000" />
    <add key="quartz.jobStore.type" value="Quartz.Impl.AdoJobStore.JobStoreTX, Quartz" />
    <add key="quartz.jobStore.tablePrefix" value="QRTZ_" />

    <add key="quartz.jobStore.driverDelegateType" value="Quartz.Impl.AdoJobStore.SqlServerDelegate, Quartz" />

    <add key="quartz.jobStore.dataSource" value="myDS" />

    <add key="quartz.dataSource.myDS.connectionString" value="Server=tcp:myserver.database.windows.net,1433;Database=mydatabase;User ID=myuser@myserver;Password=mypassword;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" />
    <add key="quartz.dataSource.myDS.provider" value="SqlServer-20" />
  </quartz>

</configuration>

That's it. The service for running the scheduled jobs is ready. As the last step we're going to create a simple job and a schedule for it.

Example job

All the bits and pieces are ready, and the only thing left is to test the service. For that we need a job to execute and a trigger which will handle the execution.

Quartz.net jobs implement the simple IJob interface. The following job will log a message when it is executed:

    public class MyJob : IJob
    {
        private static ILog logging = LogManager.GetLogger(typeof(MyJob));

        public void Execute(IJobExecutionContext context)
        {
            var mydata = context.MergedJobDataMap["data"];

            logging.InfoFormat("Hello from job {0}", mydata);
        }
    }

To execute a job we need a trigger for it. Here's example code which adds a single trigger for the job, making it execute after 15 seconds have passed. Create a Web Form "Default.aspx" in the project, add a button to it, and in its click handler create a new trigger and add it to the scheduler:

        protected void Button1_Click(object sender, EventArgs e)
        {
            var startTime = DateTimeOffset.Now.AddSeconds(15);

            var job = JobBuilder.Create<MyJob>()
                                .WithIdentity("job1", "group1")
                                .Build();

            var trigger = TriggerBuilder.Create()
                .WithIdentity("trigger1", "group1")
                .StartAt(startTime)
                .Build();

            Global.Scheduler.ScheduleJob(job, trigger);
        }

Deployment and application pools

As the service uses ASP.NET, it can be deployed to Azure Web Sites or to Azure Cloud Services. One thing to keep in mind is that IIS recycles application pools automatically. This shuts down the application, and Quartz.net with it. To get around this limitation, it may be wise to set the application pool's recycling interval to 0.
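Besides IIS Manager, the recycling settings can be changed programmatically with the Microsoft.Web.Administration API, for example from a Cloud Service startup task. A sketch (the pool name is a placeholder, and the process needs administrative rights):

```csharp
using System;
using Microsoft.Web.Administration;

class DisableRecycling
{
    static void Main()
    {
        using (var manager = new ServerManager())
        {
            var pool = manager.ApplicationPools["MyAppPool"];

            // TimeSpan.Zero disables both the periodic restart and the idle timeout
            pool.Recycling.PeriodicRestart.Time = TimeSpan.Zero;
            pool.ProcessModel.IdleTimeout = TimeSpan.Zero;

            manager.CommitChanges();
        }
    }
}
```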

Another option is to create a Quartz.net job which "pings" the current web site once every 10 minutes, or to use some other keep-alive service for the pinging.
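Such a keep-alive job could be as simple as this sketch (the URL is a placeholder for your own site's address):

```csharp
public class KeepAliveJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        using (var client = new WebClient())
        {
            // Hitting the site keeps the worker process warm
            client.DownloadString("http://myapp.cloudapp.net/");
        }
    }
}

// Scheduled e.g. with a cron trigger: "0 0/10 * * * ?" = every 10 minutes
```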

Links