Tuesday, 25 February 2014

Enabling HTTPS in Azure via worker InstanceInput endpoints

Wow, what a rare and specific thing you might want to do - but we certainly did, and since there was nothing I could find in the Google Brain, I had to figure it out for myself...

In a nutshell, here's how to do it:



  1. Create the endpoint via the role properties in your cloud service project. Since https is not an option, we need to select tcp:

    Also don't be deterred by the fact you can't select an SSL certificate here.
  2. Upload your SSL certificate to your cloud service via the Azure portal, and also make it available via the Certificates tab in your role properties:


  3. Now here's the trick: you need to bind the certificate to the internal port of the role (in this example, 10100). You can do this by running the netsh command as a startup task for the role.

    Create a batch file called bindcertificate.cmd in the root of your worker project. Ensure the file properties are Copy To Output Directory: Copy always.

    Add the following content to the file:

    @echo off

    REM   *** Bind the SSL certificate to the internal input endpoint (appid is irrelevant) ***

    netsh http add sslcert ipport=0.0.0.0:10100 certhash=4F66816E3856A3816246D17A77C62E4C66E641AF appid={00112233-4455-6677-8899-AABBCCDDEEFF} >> "%TEMP%\StartupLog.txt"

    REM   *** Exit batch file. ***

    EXIT /b 0

    Replace the port number with your private port, and the certhash with the thumbprint of your SSL certificate.
  4. Finally, add the batch file as an elevated startup task in your ServiceDefinition.csdef:

    <Startup>
      <Task commandLine="bindcertificate.cmd" executionContext="elevated" taskType="simple" />
    </Startup>
  5. In terms of setup, that's it. Assuming you're using WCF to expose endpoints via the worker role, you can now create a binding with Transport security, e.g.:

    var endpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["Search"].IPEndpoint.ToString();

    var host = new ServiceHost(yourImplementationHere);

    var binding = new WebHttpBinding(WebHttpSecurityMode.Transport);

    var ep = host.AddServiceEndpoint(
        typeof(IYourImplementation),
        binding,
        string.Format("https://{0}/YourEndpoint", endpoint));

    ep.Behaviors.Add(new WebHttpBehavior());

    host.Open();
You should now be able to access your endpoint externally, using your Azure DNS entry, along with the relevant external port, e.g.:

https://yourcloudservice.cloudapp.net:10106/yourendpoint  (role instance 1)
https://yourcloudservice.cloudapp.net:10107/yourendpoint  (role instance 2)
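Pulling steps 1, 2 and 4 together, the relevant fragments of your ServiceDefinition.csdef end up looking roughly like the following. The role, endpoint and certificate names here are placeholders built from the examples above - substitute your own:

```xml
<WorkerRole name="YourWorkerRole">
  <Endpoints>
    <!-- tcp InstanceInput endpoint: private port 10100, one public port per instance -->
    <InstanceInputEndpoint name="Search" protocol="tcp" localPort="10100">
      <AllocatePublicPortFrom>
        <FixedPortRange min="10106" max="10107" />
      </AllocatePublicPortFrom>
    </InstanceInputEndpoint>
  </Endpoints>
  <Certificates>
    <!-- Must match the certificate uploaded to the cloud service -->
    <Certificate name="YourSslCertificate" storeLocation="LocalMachine" storeName="My" />
  </Certificates>
  <Startup>
    <!-- Elevated so netsh can bind the certificate to the private port -->
    <Task commandLine="bindcertificate.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
</WorkerRole>
```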

Anthony.

Tuesday, 18 February 2014

Fixing Word Margins in Reporting Services 08

We don't use Reporting Services with SQL Server; instead we use the Report Viewer control to render reports from Azure Table Storage.

That said, I believe the rendering engine is the same in both cases - and in both cases the 08 version renders header & footer content outside of the document margins.

See this Microsoft Connect discussion on the issue, and Microsoft's good ol' "it's by design" response :)

There really is no workaround you can perform in the Reports themselves - as tricking the layout to work in Word will in turn break the layout in other formats like PDF.

So crazy idea time... since a .docx file is really just a .zip file containing a bunch of XML settings, I thought I could write a function to modify the header & footer attributes within the file.

Lo and behold, it worked really well. It works for us since we just use the Report Viewer control and not full-fledged Reporting Services - we are already working with document streams manually anyway.

Below is a code snippet of the function. Note that it requires SharpZipLib for the zip handling. We are still on .NET 4.0, but if you are on 4.5 you could swap this out for the new ZipArchive class in the .NET Framework.
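A sketch of the kind of function this describes, written against the .NET 4.5 ZipArchive approach mentioned above rather than SharpZipLib. The margin values and the choice of pgMar attributes are illustrative - which attributes you need to touch, and what to set them to, depends on your report layout:

```csharp
using System.IO.Compression; // requires a reference to System.IO.Compression.FileSystem
using System.Xml.Linq;

public static class DocxMarginFixer
{
    static readonly XNamespace W =
        "http://schemas.openxmlformats.org/wordprocessingml/2006/main";

    public static void FixWordMargins(string docxPath)
    {
        // A .docx is just a zip; the page margins live in word/document.xml
        using (var archive = ZipFile.Open(docxPath, ZipArchiveMode.Update))
        {
            var entry = archive.GetEntry("word/document.xml");
            using (var stream = entry.Open())
            {
                var doc = XDocument.Load(stream);

                // Rewrite the header/footer distances on every section's
                // <w:pgMar> element. 720 twips = 0.5 inch - an illustrative value.
                foreach (var pgMar in doc.Descendants(W + "pgMar"))
                {
                    pgMar.SetAttributeValue(W + "header", "720");
                    pgMar.SetAttributeValue(W + "footer", "720");
                }

                // Truncate the entry and write the modified XML back
                stream.SetLength(0);
                doc.Save(stream);
            }
        }
    }
}
```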

Anthony.

Tuesday, 21 January 2014

Xero API .NET Client - Implementing your own ITokenRepository

We recently integrated our FoodStorm product with Xero, and we decided to use their .NET API. Most of it is pretty straightforward - except for dealing with access tokens in .NET, as their documentation is somewhat lacking.

So here is a bit of a brain-dump of my experiences...

To perform any API operation, you need to first instantiate a Session. In our case we have Partner access, so we need to create a XeroApiPartnerSession. The constructor requires an ITokenRepository (it also requires Certificates - but that's for another blog post!).

So since you can't create an instance of an Interface, you need to create your own class that implements ITokenRepository, and inside that class it's up to you to deal with the loading & saving of access tokens.

When I first saw this, I thought it was overkill - why couldn't I just pass in my access token?!? But as I delved more into the API, I learned that access tokens expire after 30 minutes - even with Partner access. This means you need to renew your access token periodically. The good news is that if you implement an ITokenRepository, all this complexity is fully handled by the .NET API, so it's a pretty good design after all.

So onto creating a Token Repository... The ITokenRepository interface looks like this:
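It's roughly the following shape (member names may vary between versions of the API - check the source of the version you're using):

```csharp
public interface ITokenRepository
{
    AccessToken GetAccessToken();
    RequestToken GetRequestToken();
    void SaveAccessToken(AccessToken accessToken);
    void SaveRequestToken(RequestToken requestToken);
}
```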

As you can probably guess, you need to make the Get methods return your tokens, and your Set methods need to save the tokens. My initial idea was just to Serialize the AccessToken and RequestToken classes to a file - easy peasy - until I realised those classes weren't marked as Serializable...

So that meant I would need to store the values of the properties in the Set methods, and then re-create the classes in the Get methods. And that's what I did:

For simplicity's sake, I decided to serialise all the required properties to my own JSON object, as I didn't see the need for specific DB columns etc. - but that's up to you. However, note that ALL of the properties I am saving & loading here are required! If you miss any (which I initially did), things will not work correctly within the API.
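As a sketch of the approach: the token property names below (Token, TokenSecret, SessionHandle, ExpiresIn) are placeholders - map every property your version of the API's AccessToken/RequestToken classes actually exposes, or renewal will break. This uses JavaScriptSerializer since it ships with .NET 4.0:

```csharp
using System.IO;
using System.Web.Script.Serialization; // JavaScriptSerializer (System.Web.Extensions)

// NOTE: a sketch only - the property names here are illustrative placeholders.
public class TokenDto
{
    public string Token { get; set; }
    public string TokenSecret { get; set; }
    public string SessionHandle { get; set; }
    public string ExpiresIn { get; set; }
}

// Persists a token as a small JSON file on disk.
public class FileTokenStore
{
    private readonly string _path;
    private readonly JavaScriptSerializer _serializer = new JavaScriptSerializer();

    public FileTokenStore(string path) { _path = path; }

    public void Save(TokenDto token)
    {
        File.WriteAllText(_path, _serializer.Serialize(token));
    }

    public TokenDto Load()
    {
        if (!File.Exists(_path)) return null;
        return _serializer.Deserialize<TokenDto>(File.ReadAllText(_path));
    }
}
```

Your ITokenRepository implementation then just maps between a DTO like this and the API's token classes inside its Get and Set members.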

Anthony.

Wednesday, 23 February 2011

Running a ClickOnce Application as Administrator

I've been using ClickOnce to deploy our Windows apps for about 12 months now, and it's been an absolutely painless way to handle installation and application updates.
One of the reasons it's so painless is that ClickOnce doesn't require admin privileges to install apps on a machine. However this also means that you cannot launch a ClickOnce app with admin privileges.
Until recently this hasn't been a problem for us, but due to some obscure requirement that 64-bit Windows requires admin privileges to access a 32-bit ODBC data source, this shortfall has reared its ugly head.
The official Microsoft stance is that you cannot use ClickOnce if admin privileges are required, and that you should instead install using Windows Installer or similar. However this really didn't appeal, because I'd need to implement a new way of handling application updates and deal with a whole lot of change.
I came across this article by Charles Engelke, which discusses how he used a ClickOnce app to launch a secondary app as Administrator. This sounded really cool, so I wondered if it would be possible for a ClickOnce app to re-launch itself as administrator. It turns out you can.
Here's how we did it:
Our ClickOnce app is WPF, so its entry point is the Application_Startup method in App.xaml - however you probably already know the entry point for your app. We added the following:
// Required namespaces:
// using System.Diagnostics;
// using System.Reflection;
// using System.Security.Principal;
// using System.Windows;

private bool IsRunAsAdministrator()
{
var wi = WindowsIdentity.GetCurrent();
var wp = new WindowsPrincipal(wi);

return wp.IsInRole(WindowsBuiltInRole.Administrator);
}

private void Application_Startup(object sender, StartupEventArgs e)
{
if (!IsRunAsAdministrator())
{
    // It is not possible to launch a ClickOnce app as administrator directly, so instead we launch the
    // app as administrator in a new process.
    var processInfo = new ProcessStartInfo(Assembly.GetExecutingAssembly().CodeBase);

    // The following properties run the new process as administrator
    processInfo.UseShellExecute = true;
    processInfo.Verb = "runas";
        
    // Start the new process
    try
    {
        Process.Start(processInfo);
    }
    catch (Exception)
    {
        // The user did not allow the application to run as administrator
        MessageBox.Show("Sorry, this application must be run as Administrator.");
    }

    // Shut down the current process
    Application.Current.Shutdown();
}
else
{
    // We are running as administrator
        
    // Do normal startup stuff...
}
}
The idea here is that if the current process is not running as administrator, then launch a new process as administrator, then shut down the current process. The new process will realise it's running as administrator and function as normal.
The method we're using to run as administrator is to set the following properties of ProcessStartInfo:
    // The following properties run the new process as administrator
    processInfo.UseShellExecute = true;
    processInfo.Verb = "runas";
I have a feeling this is not best practice (I believe the "correct" way is to embed a manifest in your application with a requestedExecutionLevel element) - but this way was much easier since the process will only ever be launched from within ClickOnce.
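For completeness, the manifest route means adding an app.manifest to the project with a requestedExecutionLevel entry like the one below - but note that ClickOnce will refuse to deploy an app marked requireAdministrator, which is exactly why the relaunch trick above is needed:

```xml
<trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
  <security>
    <requestedPrivileges xmlns="urn:schemas-microsoft-com:asm.v3">
      <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
    </requestedPrivileges>
  </security>
</trustInfo>
```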

Wednesday, 24 March 2010

Table Storage Backup & Restore for Windows Azure

If you're using Table Storage in Windows Azure you're probably well aware of its real-time replication of data, which for me was a key factor in deciding to use the technology.

That said, I think the ability to perform a traditional database backup or restore (i.e. a snapshot of the database) would be a really nice feature - which Table Storage does not currently support. Here are my top reasons why:

  1. Data replication may protect you from disk faults, but it doesn't protect you from accidental or malicious deletion. You'll only get this by taking snapshots of your data and storing it elsewhere.
  2. From a testing perspective, it can be really handy (or sometimes imperative) to "copy back" your production DB to your UAT or development environment.

So I thought I could write my own backup tool that retrieves all data via queries and stores it in a file - and then restore it back again by performing inserts. What started as a small & quick project turned into something much bigger - so I'm releasing it as open source.

Download Table Storage Backup

The project consists of 3 components:

  1. Backup Server. The backup server can be installed in your existing Web Role or Worker Role. The backup server performs all backup & restore operations within the Windows Azure environment.
  2. Backup Client. The backup client provides a friendly way of performing a backup & restore from a Windows PC.
  3. Backup Library. You can use the backup library to implement your own backup system or automate your backup operations, e.g. perform backups on a schedule.

How does it work?

  1. Data is backed up by retrieving all entities from all tables. Each table service query returns as many entities as possible (until a partition boundary is hit or 1,000 entities are returned).
  2. Data is restored by performing batch insert operations. Each batch inserts as many entities as possible (up to 100 entities per partition, or a 4 MB batch size).
  3. All transactions are performed at the raw REST level for efficiency, and to ensure data is duplicated precisely.
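To illustrate point 2, the batching rules can be sketched like this. The entity type and the size calculation are simplified stand-ins for the real REST payloads:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical entity: only the partition key and a rough payload size matter here.
public class EntityRecord
{
    public string PartitionKey { get; set; }
    public int SizeInBytes { get; set; }
}

public static class BatchBuilder
{
    const int MaxEntitiesPerBatch = 100;
    const int MaxBatchSizeInBytes = 4 * 1024 * 1024;

    // Group entities by partition key, then split each group into chunks that
    // respect both the 100-entity and 4 MB batch limits described above.
    public static IEnumerable<List<EntityRecord>> BuildBatches(IEnumerable<EntityRecord> entities)
    {
        foreach (var partition in entities.GroupBy(e => e.PartitionKey))
        {
            var batch = new List<EntityRecord>();
            var batchSize = 0;

            foreach (var entity in partition)
            {
                if (batch.Count == MaxEntitiesPerBatch ||
                    batchSize + entity.SizeInBytes > MaxBatchSizeInBytes)
                {
                    yield return batch;
                    batch = new List<EntityRecord>();
                    batchSize = 0;
                }
                batch.Add(entity);
                batchSize += entity.SizeInBytes;
            }

            if (batch.Count > 0)
                yield return batch;
        }
    }
}
```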

Please Contribute!

If you have any questions, feedback or bug reports please post them on the CodePlex site - and if you'd like to work on this project directly please contact me!

Cheers,
Anthony.