Welcome to my Development Blog

Welcome to my new Development Blog.  This is a place for me to share Development news, tips and experiments.

Hopefully this site will become a useful source of information regarding Development, in various languages, using various applications.

DevOps Home Server – Part two – Software

For the purposes of my own home DevOps server, I decided to use the range of Atlassian software.

  • Source code control: Atlassian Bitbucket
  • Project and issue tracking: Atlassian JIRA
  • Documentation: Atlassian Confluence
  • Build and deploy server: Atlassian Bamboo
  • Code analysis: SonarQube

The main reasons for this are:

  • I have used these applications in a previous job
  • They all integrate really nicely with each other
  • They are affordable (even though there are free open source equivalents)

Once they are all set up, and connected together, you can quickly change between them when browsing the web interfaces.

In terms of price, each of these applications (apart from SonarQube, which is free) costs $10 for a permanent licence for 10 users.  If you wish to get software maintenance, then you can pay that price every year (actually, it's half that for extending the maintenance after the initial purchase), but in my case the initial purchase is going to be fine: $40 in total for the four paid applications, not much more than a decent takeaway.

The Virtual Machines

Both VMs are running on my original Intel NUC server (i7, 16GB RAM) and are set up with Microsoft Windows Server 2012.  I decided to create a separate SQL Server machine for a couple of reasons: to reduce the risk of failure (so one VM is not hosting everything), and also so that if I ever need a general purpose SQL Server, it can be used for that as well.

SQL Server

I have installed Microsoft SQL Server 2016 on the VM, opened up the appropriate ports, set up firewall rules and so on, and that was about it.  It now sits there chugging away nicely.  I have yet to sort out any form of backup system; I will probably set up some automated database backups, but I will also aim to have the actual VM backed up as well.

DevOps Server

The DevOps Server is another VM, although I have given it a bit more power than the SQL one, purely because it will be hosting a number of applications at the same time and will need a fair amount of oomph to keep everything running smoothly.

Installing the software

NOTE: this is not a guide on how to install everything, just an overview of the experience.

The first step was to install Java, as most of the Atlassian products are built on it.  I also installed Visual Studio, as I knew that was a requirement for the Bamboo build and deploy server.  I then installed and configured the following applications, one by one:

  • Atlassian JIRA
  • Atlassian Confluence
  • Atlassian Bitbucket
  • Atlassian Bamboo
  • SonarQube

Each piece of software required me to choose a program location and a data directory.  Most also added a Windows Service (although some had to be installed manually by running command scripts).  Each application would then start and allow me to perform the first-time configuration: the usual initial settings, database connection and so on.  Most databases had to be created prior to running the first-time config, and also had to adhere to the software's requirements in terms of collation settings.  Each database was given its own specific user for the application to use (a standard SQL user).
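
As an illustration, creating one of these databases up front looked roughly like the sketch below, scripted here in C#.  The server name, database name, user name, password and collation are all placeholders; each product's install guide specifies the exact collation it needs, so treat this as a template rather than a recipe.

using System.Data.SqlClient;

class CreateApplicationDatabase
{
    static void Main()
    {
        // Example connection to the SQL VM; adjust the server name to suit.
        using (var conn = new SqlConnection("Server=SQLVM01;Integrated Security=true"))
        {
            conn.Open();
            // Collation shown is an example; check the product's documented requirement.
            Run(conn, "CREATE DATABASE jiradb COLLATE SQL_Latin1_General_CP437_CI_AI");
            // A dedicated standard SQL user for the application to connect as.
            Run(conn, "CREATE LOGIN jirauser WITH PASSWORD = 'ChangeMe!123'");
            Run(conn, "USE jiradb; CREATE USER jirauser FOR LOGIN jirauser; ALTER ROLE db_owner ADD MEMBER jirauser");
        }
    }

    static void Run(SqlConnection conn, string sql)
    {
        using (var cmd = new SqlCommand(sql, conn)) cmd.ExecuteNonQuery();
    }
}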

As I wanted them all to share the same user database, I elected to use Active Directory, although this had to be set up after the initial install.  Most of the applications required the same AD configuration, and once I had figured it out for the first one, the others were straightforward.
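
For reference, the AD settings each application asked for were along these lines; the host name, account and DNs below are placeholders for illustration, not my real values:

Directory type     : Microsoft Active Directory
Hostname           : dc01.home.local
Port               : 389
Username           : CN=atlassian,CN=Users,DC=home,DC=local
Base DN            : DC=home,DC=local
User object filter : (&(objectCategory=Person)(sAMAccountName=*))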

Once everything was up and running, I was able to connect all of the Atlassian applications together so that they show up on each application's hamburger menu, and as they are all configured with the same set of AD users, I can switch seamlessly between them.

External access

Obviously, as this is all just running on my home network, I still want to be able to access it all whenever I am away, so I wanted to expose it to the outside world.  I am not worried about security, or about having my home broadband thrashed by outside users, as it's only going to be me using it.

To expose them online, all I had to do was configure some port forwards on my router, so that any traffic hitting it on the relevant application ports was forwarded to the correct port on the virtual server.  I configured my router to use a dynamic DNS service (Netgear's own) so that my home network could be reached at somehost.mynetgear.com on all of the configured ports.
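
For reference, the ports I forwarded were simply the defaults of a stock install of each application; double-check your own setup, as these can be changed during installation:

  • JIRA: 8080
  • Confluence: 8090
  • Bitbucket: 7990
  • Bamboo: 8085
  • SonarQube: 9000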

I then configured an appropriate domain so that various subdomains would also redirect to somehost.mynetgear.com:

jira.someexternaldomain.co.uk → somehost.mynetgear.com

confluence.someexternaldomain.co.uk → somehost.mynetgear.com

bitbucket.someexternaldomain.co.uk → somehost.mynetgear.com

bamboo.someexternaldomain.co.uk → somehost.mynetgear.com

sonar.someexternaldomain.co.uk → somehost.mynetgear.com

Unfortunately, this means that everything is still reliant on explicit port numbers, so when accessing one of the sites I still need to use something like http://confluence.someexternaldomain.co.uk:8090, but I am not too bothered about that.

Internal DNS

Now, one issue that I discovered with this plan was that when the applications communicated with each other, they would actually do a round trip out to the internet and back, because all of the URLs configured on the server itself were full domain names rather than internal IP addresses.  This was not good, as it would have introduced unnecessary delays.  To fix it, I went on to my Active Directory VM and set up forward lookup DNS, so that any machine on the internal network using AD resolves these URLs to internal addresses and never needs to leave the network.
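
In practice, that meant creating a forward lookup zone for the external domain on the AD DNS server, with an A record per subdomain pointing at the DevOps VM's internal address.  The records looked something like this (the IP address is an example):

Zone: someexternaldomain.co.uk
jira        A    192.168.0.10
confluence  A    192.168.0.10
bitbucket   A    192.168.0.10
bamboo      A    192.168.0.10
sonar       A    192.168.0.10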

Performance

In terms of performance, it all works rather well.  There are times (such as when a build job is running) when things slow down somewhat, but I would say it's no worse than I have seen in a real production environment.  It feels like I have my own professional business environment right at home.  The only issue is that the actual server makes a bit of noise; the fan seems to be working pretty hard, but I can just shut the door on my office and forget about it.

In the next blog post, I am going to describe the workflow of using it all.

Portals Support added to CRM Utilities for Visual Studio

Get it on the Marketplace

I have updated my extension to support publishing of files from Visual Studio to Microsoft Dynamics Portals.

The tool now supports publishing files to Web Templates and Web Files, allowing you to use Visual Studio to edit and track changes of your portal related files, and quickly update Dynamics with the appropriate Portal files.

Web Files and Web Templates are simply listed within the Web Resource linker dialog for you to select.  You can then publish the appropriate files within your Visual Studio solution to Dynamics.

If you have installed the extension from the Marketplace, it should prompt you to update, but if not, you can get it from the link below.

CRM Utilities for Visual Studio

My utilities are now on the Visual Studio Marketplace

Just a quick update to say that all of my Visual Studio extensions are now on the Microsoft Visual Studio Marketplace, and are available to download and install direct from Visual Studio.

Visual Studio Marketplace

Simply go into the Tools menu and choose Extensions and Updates, select Online and search for me, James Hall.  My extensions are the top two in the list.

In theory, if you install them this way, you should get notified when I update them.

CRM Utilities for Visual Studio – Update to menus, and class generation options

Today I have released an update to the CRM Utilities for Visual Studio 2017 extension.

New features:

  • Reorganised the menu structure so that the Generate Class options are now grouped together.
  • Added a Generate Class options menu that allows a custom namespace and class name to be used when generating the class files that represent the Dynamics Entities.
  • Redesigned the Connection dialog to make it look better, and to include a hyperlink to the instruction pages on this blog.


Download
Please note this feature is only available in the Visual Studio 2017 version. This version may still install on VS2015, although I have not personally tested it.

CRM Utilities for Visual Studio – Publish All option

A recent request for a new feature has resulted in a quick update to my CRM Utilities for Visual Studio.

There is now a Publish All Web Resources feature which will publish all files that have already been linked within a Project.

Hopefully this will be very useful for making sure all the Web Resources that are part of a solution are up to date in CRM before doing a solution release.

Get it from here

CRM Utilities for Visual Studio – Generating Entity Classes

Most CRM developers either use, or have at least heard of, CrmSvcUtil for generating early bound classes, using the resulting classes to manipulate CRM data.  I personally do not like working with early bound entities, as the resulting class files are huge; I prefer working with the standard late-bound Entity class for creating and updating entities, and for LINQ queries.

Often, I use helper class libraries to represent the custom entity names and attributes, so that they can be referenced in code.  This provides a degree of separation from the actual schema names, makes code easier to write, and supports IntelliSense.

Something like the code sample below:


public static class Contact
{
    // const fields are implicitly static in C#, so "static const" will not compile.
    public const string EntityName = "contact";
    public const string Name = "fullname";
}

This would then allow you to do the following:

public void CreateContact(IOrganizationService service)
{
    // service is an IOrganizationService connected to your CRM instance.
    Entity contact = new Entity(Contact.EntityName);
    contact[Contact.Name] = "Joe Bloggs";
    service.Create(contact);
}

I was offered a suggestion by a fellow developer: wouldn't it be good if my CRM Utilities for Visual Studio could generate this kind of class file automatically?  Well, I thought it was a brilliant idea, and so, thanks to the wonderful gentleman of XRTSoft, here it is.

It's split into two options: one to generate classes for your custom entities, and one for the standard CRM entities.

The resulting file will look something like this:
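
Here is a rough sketch of the shape of the generated file; the entity, attributes and Option Set values shown are illustrative, not the exact generated output:

public static class Account
{
    public const string EntityName = "account";     // Logical Name
    public const string EntityId = "accountid";     // Primary ID Attribute
    public const string Name = "name";              // Primary Name Attribute

    // ...then every other attribute on the entity.
    public const string AccountNumber = "accountnumber";
    public const string StatusCode = "statuscode";

    // Nested classes expose Option Set values so you can reference them
    // in code without looking them up in CRM.
    public static class StatusCodeValues
    {
        public const int Active = 1;
        public const int Inactive = 2;
    }
}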

Notice that for each Entity, it will add the Logical Name, Primary ID Attribute, and the Primary Name Attribute as standard, and then all of the attributes as well.  It will also add sub classes for any Option Sets to allow you to reference specific Option Set Values without having to look them up in CRM.


Download
Please note this feature is only available in the Visual Studio 2017 version. This version may still install on VS2015, although I have not personally tested it.


GPD Pocket review

I've always enjoyed small gadgets. They make me happy. I always seem to have a desire to game and compute on the go, using all manner of devices. I remember when I was a young lad, holding down a Saturday job in a computer shop, selling Amigas, PCs, games and the like. At one point, my boss had a Psion Series 3 sat on a shelf, given to him by a rep as a freebie. He had no intention of selling it or using it himself, and when I showed an interest in it, he took advantage of me. I don't remember how much he charged me for it, but it wasn't cheap. Since then I was hooked, moving on up to the Psion Series 5, the Psion Series 7, and countless PDAs in the years after.

Now, everything is a lot more advanced, but not as fulfilling as it once was. While waiting for the Gemini PDA to be released (a re-invention of the Psion Series 5, with a full keyboard, but running Android on smartphone-like internals), I wanted something that would let me compute on the go, and that's where the GPD Pocket comes in.

It's a small 7″ Windows 10 PC with a keyboard, but in this modern age it runs an Intel Atom based processor and has 8GB of RAM, hopefully giving it the ability to perform at a decent pace. 128GB of storage is not as much as I would have liked, but it is sufficient to be more than usable.

Build quality

The build quality of this device is impressive. It's a full metal body with no flex; it feels cool to the touch and looks like a little MacBook. When opening it, the screen can be positioned at any angle you choose, the hinge is solid, and when you switch it on it looks amazing. The screen is very bright, the resolution is spot on (it's a full 1080p display) and it is very readable. The keyboard has a nice feel to it and plenty of travel for such a small device. It has a USB-C port, a standard USB port, a headphone jack, and a mini HDMI port. In such a small device they couldn't fit a trackpad, so it instead has one of those lovely nipple pointers common on older laptops, or on the ThinkPads of today.

Overall, I am very impressed with the build quality.

Usability

As for usability, it works very well. The screen is easy to read and not too small, the pointing nipple works well if you're used to these kinds of things, and the keyboard is much better than I was expecting. I had read a lot of reviews about the keyboard being a little difficult to get used to, but I have warmed to it very quickly. The layout is a bit non-standard, with the Tab key in an unusual place, but I am now quite used to it and can type at a reasonable speed. The fan is a bit audible, but nothing too bad; certainly not the loudest I have experienced. I find myself using a mixture of the nipple and the touch screen (did I mention it is also touch enabled?) to navigate around, and the size of the device means it can be put in your pocket (it still needs a largish pocket). I don't normally use standby on laptops, but with this I find I am more inclined to just shut the lid so I can resume where I left off.

Portability

As already mentioned, it does fit in a pocket, maybe a jacket pocket, but you would not want to stick it in your jeans pocket. It's the perfect size, though, for throwing in a bag or your coat pocket, or even just carrying. It's no bigger than a paperback book when you hold it in your hand.

It certainly has allowed me to take a computer with me where I would not normally take a laptop, which I like.

Battery life

So far, battery life has been very good. It's rated at up to 12 hours, which is most likely very unrealistic. I have not actually done any timings yet, but it feels like I am getting around 6 hours of use, or maybe about 2 days of regular quick bursts. The good thing is that it charges over USB-C, so if I am on the go I can keep a battery pack at hand to keep it going for longer, and I can even charge it with a decent smartphone charger. No external power bricks required.

When putting it into sleep mode, it seems to lose around 1-3 percent of battery overnight. It should be noted, though, that I was experiencing a few random restarts in sleep mode until I set Windows 10 to disable Wi-Fi when asleep.

What I use it for

My use so far has been to do a bit of writing on the go, or when away from other PCs; so, writing blog posts, stories and the like. I also have Visual Studio installed for some on-the-go development, which works surprisingly well. The usual Office and Outlook are also present.

Oh, and how about a bit of gaming? Now, I know I am not going to be playing PC games on the keyboard (and it's too much clart on to connect a Bluetooth controller, although that does work), but how about a bit of retro ZX Spectrum emulation? This works well, and playing some old Spectrum games on the keyboard reminds me so much of the good old days.

And how about a quick ZX Spectrum gaming video!


DevOps home server – part one – the equipment


Ah, DevOps, such a buzzword now.  It seems that everyone wants to bring Operations and Development together in a harmonious gathering of intellectual minds.  To get in on the action, I wanted to do some hands-on development, with a little saucy operation to go with it, so I decided to experiment with some home server shenanigans.

Why bother, I hear you say; why not just use the existing cloud services, I hear you cry.  Well, I really don't have an explanation, other than to say: why not?  Sometimes a workplace may not be in a position to use the various cloud services and may have to host everything itself, so it's worth having a bit of experience with such a situation.  So, I bring to you my experience of setting it all up, using Windows Hyper-V.

Firstly, a little bit of background as to what I already had, before I get into the most recent state of my little home development environment.

Equipment

The existing Hyper-V server


I have always had the drive to have my own little server at home, primarily to run my own instance of Microsoft Dynamics, simply because for development purposes it was too expensive to have an online instance for general experimentation.  As a result, quite a while ago I purchased a small Intel NUC small form factor PC to run Hyper-V on and host my development environments.  This was an Intel i7 PC with 16GB of RAM, a 512GB SSD, and a 1TB hard disk.  It was a small machine, took very little power, and since Windows 10 Pro also provides Hyper-V virtualisation, there was no need for it to run Windows Server.  On this PC, I set up a virtual machine to host Active Directory, and then a second virtual machine where I installed Microsoft SQL Server and Microsoft Dynamics.  This provided me with a nice little CRM test environment.

Over time, I hit a little snag when my requirements outgrew the little machine: I needed other virtual machines to host other bits and bobs, and the initial CRM VM also became bogged down with the number of different CRM organisations I was running.  It was time for an upgrade.

Hyper-V server, the second coming


So, keeping with the excellent experience I had with the Intel NUC, I decided to get a new one, the latest version.  This little baby had the latest i7 and supported a maximum of 32GB of RAM, but otherwise was pretty similar to the original.

https://www.intel.co.uk/content/www/uk/en/products/boards-kits/nuc/kits/nuc7i7bnh.html

Coupled with 32GB of RAM, a 512GB SSD for the operating system, and a 2TB 2.5-inch drive, it was ready to receive Windows 10 Professional.

Installing Windows 10 was easy, as was adding Hyper-V and then moving my VMs across to it.  With the 32GB of memory, I was able to increase the memory of the Dynamics virtual machine to give it space to grow, and I still had memory left over.

Enabling remote desktop access allowed me to unplug the server from the screen and keyboard and position it out of the way, connected only to power and ethernet.  And there it sits, chugging away.

Servers in action

And this is what they look like.

Enabling OneDrive when some pesky admin has disabled it

On my corporate machine, it turns out that the powers that be decided to disable OneDrive.  I have a personal 1TB OneDrive account which I use for all manner of things, and I was not happy about this, so I set about finding a solution.

I found two useful things.

  • Local Group Policy Editor

    In the Local Group Policy Editor (run gpedit.msc, or just search for group policy in the Cortana search box), navigate to:

    Computer Configuration\Administrative Templates\Windows Components\OneDrive

    Find the setting:

    Prevent the usage of OneDrive for File Storage

    Set it to Disabled.  Even if it currently shows as Not Configured, explicitly set it to Disabled.

  • Windows Registry

    In the Registry Editor (regedit), navigate to:

    HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\OneDrive

    Find the values (one or both may be present):

    DisableFileSyncNGSC
    DisableFileSync

    Change those values to 0.
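
If you would rather script the change than click through regedit, a small C# sketch along these lines (run elevated, as a 64-bit process) flips both values back to 0; the value names are the ones listed above:

using Microsoft.Win32;

class EnableOneDrive
{
    static void Main()
    {
        const string path = @"SOFTWARE\Policies\Microsoft\Windows\OneDrive";
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(path, writable: true))
        {
            if (key == null) return; // policy key not present, nothing to do

            foreach (string name in new[] { "DisableFileSyncNGSC", "DisableFileSync" })
            {
                // Only touch values the admin policy actually created.
                if (key.GetValue(name) != null)
                {
                    key.SetValue(name, 0, RegistryValueKind.DWord);
                }
            }
        }
    }
}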


Doing both of these may be enough to allow OneDrive to run again.  However, you may find that upon the next boot, or when you connect to your corporate VPN, one or the other has reverted.

To try and fix this, within the Registry Editor I have modified the permissions on the OneDrive policy key itself (HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\OneDrive) and denied access to it for some of the users and groups.  I think it has worked, but time will tell, I suppose.


Easy access to your TFS/GIT folder

This post describes something that I always do when setting up a new PC with Visual Studio.  It's only a minor thing, but it makes all the difference to me 🙂

The number of times I open Windows Explorer to navigate to my TFS folder in a day is quite high, so I follow these steps to make life a bit easier, especially on Windows 10.

  1. Right click on your main TFS folder and select Properties.

  2. Go to the Customise tab and select Change Icon.

  3. Click Browse and find an appropriate icon file or, in my case, navigate to your Visual Studio folder, select devenv.exe, and pick the appropriate icon.
    C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\devenv.exe

  4. Hit Apply, and your TFS folder will now have a nice new icon.  To make it easier to access, you can also right click it and select Pin to Quick access.

    This will mean it will always show up at the top left of Windows Explorer in your Quick access panel.
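
Under the hood, all the Customise tab is doing is writing a desktop.ini file into the folder, so you could achieve the same thing by hand with something like the following (the trailing 0 is the icon index within the file, an assumption here; pick whichever icon you chose in the dialog):

[.ShellClassInfo]
IconResource=C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\devenv.exe,0

Windows also marks the folder itself as read-only, which is what tells Explorer to look at the desktop.ini.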

This can be applied to any folder of course, not just your TFS folder, as long as you have an appropriate icon.

One thing I would say, though: don't go customising all of your folders with nice new icons.  I'm not sure, but I guess it may have an effect on performance.  Windows probably caches the icons, but just in case, be careful.