One of my better achievements of 2016 was convincing a friend from the Netherlands to use Microsoft Azure to deploy the back end for his mobile app. We had a very long conversation after office hours; he argued that instead of Azure he could host his DB server locally and do the same thing. He had a point. All my arguments were about scaling and enterprise-level security, but most of the time developers just need to get their hands dirty. Earlier this month another friend of mine wanted me to develop an app for his idea. Developing an app and hosting its services on Azure is not a big deal, but this time I wanted to try things out and get my own hands dirty. Despite my experience with MySQL and Mongo, I had never hosted a SQL Server DB before (outside my office work) because of the money involved. So I thought I would try the new SQL Server on Linux offering, for which I don't need to care about money.

Machine

I didn't want to risk my Acer Aspire P3 tab running Ubuntu 16.04, since many people use it on weekends, although that would have been the ideal way of doing it. So I decided to create a cheap Ubuntu 16.04 VM on Azure. I chose an F2-type VM with 4 GB of RAM residing in East US 2. It's probably the cheapest option, since you need about 3.5 GB of RAM to install SQL Server on Ubuntu.

[Image: Azure F-series VM sizes]
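Since the RAM floor is what decides the VM size, a quick sanity check helps before picking a machine. Here is a small helper of my own (not from the SQL Server docs; the 3.5 GB figure is the one quoted above) that takes a machine's total memory in kB, as reported by `/proc/meminfo`, and says whether it clears the bar:

```shell
# Required memory for SQL Server on Ubuntu, per the figure above: 3.5 GB in kB.
required_kb=$((3500 * 1024))

check_mem_kb() {
  # $1: total memory in kB (e.g. from: awk '/MemTotal/ {print $2}' /proc/meminfo)
  if [ "$1" -ge "$required_kb" ]; then
    echo "OK: enough RAM for SQL Server"
  else
    echo "Insufficient RAM: need at least 3.5 GB"
  fi
}

# On the VM itself you would call it like this:
check_mem_kb "$(awk '/MemTotal/ {print $2}' /proc/meminfo)"
```

The F2's 4 GB clears the check; smaller burstable sizes generally do not.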

Installing SQL Server

It's just a matter of executing a few commands over SSH. I used WebSSH, but most people (including me, most of the time) use PuTTY. I didn't install any desktop environment on this machine. On Ubuntu, execute the following commands (also provided in the documentation):

curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server.list | sudo tee /etc/apt/sources.list.d/mssql-server.list

sudo apt-get update

sudo apt-get install -y mssql-server

 

Done with the installation. Now we have to run the configuration script that came with the installation, where we configure the SA account: it's just a matter of giving a password for the account and setting the SQL Server service to start on boot. So execute the command below, give a password for SA, and say "Y" to start the service on startup.

sudo /opt/mssql/bin/sqlservr-setup

After this, verify that things went smoothly by executing

systemctl status mssql-server
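For scripting, `systemctl is-active` is easier to act on than the full `status` output. Here is a tiny helper of my own (an assumption, not from the docs) that interprets that state string:

```shell
# Interpret the state string printed by `systemctl is-active <unit>`
# ("active" when the unit is running, something else otherwise).
service_health() {
  case "$1" in
    active) echo "running" ;;
    *)      echo "not running" ;;
  esac
}
```

On the VM this could be used as `service_health "$(systemctl is-active mssql-server)"`; if it reports "not running", a `sudo systemctl restart mssql-server` usually brings the service back up.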

 

To install the SQL Server tools for Linux, execute the following set of commands:

curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/msprod.list

sudo apt-get update 

sudo apt-get install mssql-tools unixodbc-dev

And to create symlinks to 'sqlcmd' and 'bcp' under /usr/bin/ (root is needed to write there):

sudo ln -sfn /opt/mssql-tools/bin/sqlcmd-13.0.1.0 /usr/bin/sqlcmd

sudo ln -sfn /opt/mssql-tools/bin/bcp-13.0.1.0 /usr/bin/bcp
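An alternative to the version-pinned symlinks (my preference, not from the docs) is to put the tools directory on PATH, so any future tool versions under /opt/mssql-tools/bin are picked up without re-linking:

```shell
# Add the SQL Server tools directory to the current shell's PATH.
export PATH="$PATH:/opt/mssql-tools/bin"
```

Appending that same line to ~/.bashrc makes it stick across logins.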

 

Endpoint Configuration

Go to the ENDPOINTS tab in the VM's window and click ADD to add MSSQL, leaving the default settings (if the Public Port is not set to 1433, set it and save).

[Image: Azure VM endpoint configuration]

Now you are OK to connect remotely to the SQL Server service hosted on the VM.

Accessing DB Remotely

Before this, we need to create a database on our newly created server. So go back to SSH and execute the following:

sqlcmd -S localhost -U SA -P '[SA_Password]'

CREATE DATABASE quickcustinfo_db;
GO

Now we have created a new database called quickcustinfo_db.
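The same thing can be done non-interactively, which is handy for provisioning scripts. A sketch: put the statements in a script file, which sqlcmd can then run with its -i option instead of typing them at the prompt:

```shell
# Write the T-SQL batch to a file; sqlcmd can execute it with -i.
cat > create_db.sql <<'SQL'
CREATE DATABASE quickcustinfo_db;
GO
SQL
```

Then run `sqlcmd -S localhost -U SA -P '[SA_Password]' -i create_db.sql` (the -i flag reads input from the file).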

Go to Visual Studio -> Server Explorer -> Connect To Database

[Image: connection details dialog]

Note that I selected SQL Server Authentication and gave SA and my password (this is risky business; still, for my POC I decided to go this way, BUT MANIPULATING A DATABASE AS "SA" FROM CLIENT APPLICATIONS IS NOT ADVISED). When I populate the combo box to select the database name, I can see "quickcustinfo_db". Select it and press OK.

Go to the properties of this database and copy the connection string. It will look like

Data Source=qcidbserv.cloudapp.net;Initial Catalog=quickcustinfo_db;User ID=SA;Password=***********

Copy and paste it into a notepad and replace *********** with the actual password.
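To make the shape of that connection string explicit, here is a hypothetical helper that assembles one from its parts; the server and database names are the ones from this post, and the password is of course a placeholder:

```shell
# Assemble an ADO.NET-style connection string from its four parts.
build_conn_str() {
  # $1: server, $2: database, $3: user, $4: password
  echo "Data Source=$1;Initial Catalog=$2;User ID=$3;Password=$4"
}

build_conn_str qcidbserv.cloudapp.net quickcustinfo_db SA '[my_sa_pwd]'
```

The output is exactly the string that goes into Web.config in the next section.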

ASP.NET Web API 2 Application – Web.config

My idea is to use EF, so I created a new Web API 2 application. In its Web.config file, we need to replace the default connection string with the one above, so that data is seeded and manipulated on our remote database hosted on Linux instead of on (localdb). I made the changes like below.

<connectionStrings>
  <add name="QuickCustInfoContext" connectionString="Data Source=qcidbserv.cloudapp.net;Initial Catalog=quickcustinfo_db;User ID=SA;Password=[my_sa_pwd]"
    providerName="System.Data.SqlClient" />
</connectionStrings>

Now its all about creating models and seed the data.

Models

For this POC, I created two models

  • Customer.cs
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.Web;

namespace QuickCustInfo.Models
{
    public class Customer
    {
        public int Id { get; set; }
        [Required]
        public string Name { get; set; }
    }
}
  • Invoice.cs
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.Web;

namespace QuickCustInfo.Models
{
    public class Invoice
    {
        public int Id { get; set; }
        [Required]
        public string Title { get; set; }
        public string ShortDesc { get; set; }

        // Foreign Key
        public int CustomerId { get; set; }
        // Navigation property
        public Customer Customer { get; set; }
    }
}

 

Controllers and DataContext

Next, create a controller with the option "Web API 2 Controller with actions, using Entity Framework" (this is one of the coolest features I love). Select the Customer model first and give a new generic DataContext class and a controller name. Do the same for the Invoice model with the same DataContext class. Rebuild the project.

In the Package Manager Console, execute

Enable-Migrations

 

Now expand the newly created Migrations folder in solution explorer and open Configuration.cs and modify Seed() like below.

namespace QuickCustInfo.Migrations
{
    using System;
    using System.Data.Entity;
    using System.Data.Entity.Migrations;
    using System.Linq;
    using QuickCustInfo.Models;

    internal sealed class Configuration : DbMigrationsConfiguration<QuickCustInfo.Models.QuickCustInfoContext>
    {
        public Configuration()
        {
            AutomaticMigrationsEnabled = false;
        }

        protected override void Seed(QuickCustInfo.Models.QuickCustInfoContext context)
        {
            context.Customers.AddOrUpdate(x => x.Id,
                new Customer() { Id = 1, Name = "Anne" },
                new Customer() { Id = 2, Name = "John" },
                new Customer() { Id = 3, Name = "Balmer" },
                new Customer() { Id = 4, Name = "Trump" }
                );

            context.Invoices.AddOrUpdate(x => x.Id,
                new Invoice() { Id = 100, CustomerId = 1, Title = "Paint", ShortDesc = "Blue Color Paint" },
                new Invoice() { Id = 101, CustomerId = 3, Title = "Pen", ShortDesc = "Red Color Pen" },
                new Invoice() { Id = 102, CustomerId = 2, Title = "Shirt", ShortDesc = "Deedat TShirt" },
                new Invoice() { Id = 103, CustomerId = 4, Title = "Paper", ShortDesc = "Waste Paper" }
                );
        }
    }
}

Note that DbMigrationsConfiguration is typed with the context we created earlier – QuickCustInfo.Models.QuickCustInfoContext (the DataContext).

Again in the Package Manager Console

Add-Migration Initial
Update-Database

That's it, DONE 🙂

Make Sure Things Went Well

Go to SSH again, open sqlcmd, and execute the below:

USE quickcustinfo_db;
SELECT * FROM CUSTOMERS;
GO

and if everything went well,

[Image: sqlcmd query results]

What's Next

Host the Web API project on a free-tier Azure Web App and call the APIs from your clients to do the CRUD operations.

Conclusion

The POC was successful, and doing the same thing without any expense at all is just a matter of hosting SQL Server on a local Ubuntu machine instead.

Things are possible. Happy coding!
