Friday 30 November 2012

My Life in Computer Games: Part 1

I’m coming to a realisation that has slowly been dawning on me for a while now: I’m going to give up playing computer games. For someone who has been playing them since childhood that’s quite a big thing to say, but there are several reasons for it.

First and foremost, I do not have the time anymore. I’ve got a full-time job and a family to support, so free time for me is extremely precious. I’m currently playing The Legend of Zelda: Skyward Sword, but I’ve been doing that since Christmas 2011! And I still haven’t finished! So far I think I’ve clocked up about 40 hours of gameplay, but that is spread out across a year, playing maybe a couple of hours a week at best. Finding the time for anything more involved than Angry Birds is simply difficult these days.

But it’s not just a matter of finding the time; I actually don’t feel that bothered about computer games anymore. In the limited free time I have left I would rather be doing something constructive (such as writing this blog) and learning more about programming. I’m still interested in how computer games are made – I found the code reviews that Fabien Sanglard did on the Quake and Doom game engines really interesting – and I still appreciate them – I think Halo 4 looks amazing. But playing them? Meh.

So I thought I would make a list of all the memorable games I’ve played during my life so far and reminisce, which is going to span a few posts.

The ZX Spectrum

When I first asked my parents for a computer to play computer games I thought I would be getting a Nintendo or a Sega console. Instead what I got was a ZX Spectrum +2, so not quite what I was expecting.

In the end it did shape my entire career by introducing me to programming, but as a games machine all I remember is having to load tapes, which took minutes – and also provided a lovely whining noise and a hallucinogenic loading screen – only to play the game for 30 seconds and give up because I didn’t find it entertaining. Maybe the next one will be better… (wait another 5 minutes to load the next tape).

A ZX Spectrum loading a program. Wow, my head hurts…

So I gave up on the Spectrum and got myself a Sega Master System, which then led to a Sega Mega Drive, which then led to…

Sonic the Hedgehog

Back in the 16-bit days you were either with Nintendo or Sega, Mario or Sonic. I chose Sonic and have many happy memories of those games, because they were fast!

Sonic the Hedgehog 2 was my personal favourite; the video above shows one of my favourite levels due to the sheer speed you can crank up to – so fast that the screen sometimes can’t even keep up with you!

But the real gem of the series was Sonic 3 and Sonic & Knuckles. Both games were great on their own, but the “lock-on” technology that Sonic & Knuckles brought turned them, combined, into an amazing experience – extending the Sonic 3 game, or, by joining Sonic 2 to it, turning an old game into a brand new one.

Recently I wondered how they even did the “lock-on” bit: how do you turn one game into three games? And how do you take an old game which wasn’t even designed for this kind of thing and make it into what is effectively a brand new game? It turns out it was a clever ROM trick – joining the ROM chips together so they act as one larger one; this long forum post explains it in far more detail.

The 32-bit Years

After the Mega Drive I got a Sega Saturn – not sure why, in retrospect – rather than a PlayStation, which was the latest hotness at the time. There were some good games such as Sega Rally and NiGHTS into Dreams, but this was the advent of 3D graphics and things were just starting out. One game caught my attention though which made me rethink everything…

GoldenEye

Need I say more? Oh, alright then

I remember going round to my friend’s house and they were playing GoldenEye. I had a go – the Nintendo 64 controller looked a bit weird to hold – but I started playing and quickly realised that I was completely hooked on the multiplayer. Consoles before the N64 typically supported two players, but this one supported four, meaning that multiplayer shooters finally made sense outside of a PC.

So I got an N64 with GoldenEye as my first game for it. The single player missions were great with lots of challenges added as the difficulty ramped up, but my single defining memory of this game was spending nearly all my free time between A-Level lectures playing deathmatch games against my friends. And beating them. Over and over again.

Super Mario 64

Now that I had an N64 I wondered what other games to play on it, so of course I got the N64 killer app: Super Mario 64.

Before then I hadn’t actually played a Mario game, but this one was great. I remember just wandering around the castle hub-world doing all sorts of acrobatics simply because I could, and the analogue stick finally let me move around in 3D in a way that made sense. Although I have to admit the camera controls did suck.

Throw a dinosaur four times my size into a bomb? No problem!

I also came to realise that games that Nintendo made really were top quality; they were fun, inventive and imaginative.

The Legend of Zelda: Ocarina of Time

This game came out during Christmas 1998 and was snapped up by just about everyone at the time. I was very lucky to get this on Christmas Day and spent the next two weeks solid becoming immersed in it.

Just as with Mario, I had never played a Zelda game before, so I never experienced the sense of adventure the previous games conjured up – and this did feel like a real adventure. It introduced concepts like “locking on” to your target during combat so you could always see them (something countless games copied afterwards), and it split the world across two time periods – present and future – meaning you played as both Young Link and Adult Link. And you got to ride a horse!

Looking back I simply remember the variety of the gameplay, the sense of scope (looking into the distance at the volcano, or riding across Hyrule Field), and the final climactic battle with Ganondorf.

The Legend of Zelda: Majora’s Mask

The sequel to Ocarina of Time, this was like a Zelda game and also not like one at the same time. With the benefit of hindsight I think this had a lot more emotional depth than Ocarina.

The main selling point of this game was that you had to save the world from destruction in just three days (game time, not real time), but of course you couldn’t do everything in that kind of timeframe. So a “Groundhog Day” concept was used: you relived the same three days over and over again to make progress, watching the lives of the world’s inhabitants play out each cycle and making notes of the key points in time when they would do certain things. It was a bit harder than Ocarina and there was an added sense of urgency (what with an evil-looking, ever-looming moon constantly visible in the sky), but it was still well worth playing.

That doesn’t look promising. Image courtesy of http://www.ogeeku.com

The emotional depth I mentioned, though, was something I wasn’t expecting; every character had a backstory and it was your job to help them. One little girl’s father had turned into a monster, yet she tried to shield him from the world to protect him. A baby had lost his father and brought depression to everyone around him. Probably the most poignant was having to reunite a wife and husband – the longest of all the side quests. You eventually did, but only just in time, by which point the moon was about to crash into the world – you brought them back together long enough for them to properly say goodbye to each other. This adventure was not about stopping a single enemy; rather, it was about healing the wounds of the people of the world.

Next Time

There’s still a lot more to go, so stay tuned for the next part.

Wednesday 14 November 2012

Forcibly Uninstall Apps from Windows 8

I took the plunge a couple of weeks ago and upgraded to Windows 8. It takes some getting used to, but overall I’m mostly happy with it – except when it did something very unusual with the Metro apps (I refuse to use another name, since everyone by now knows what “Metro” means).

For those who don’t know: when updates are available for Metro apps, a little number appears on the Windows Store live tile. One day I found that there were updates available for all the built-in apps such as Mail, Calendar, Bing and so on. So I dutifully started the update process, found it was taking a while as there were quite a few, and left it to it.

To be fair, I might have messed up my system myself, because while this was happening I thought, “I don’t need Travel, Sports or anything like that. I’ll just keep the ones I’m interested in.” So I uninstalled the apps I didn’t care about whilst they were still updating.

In retrospect I should have known better – I’m effectively classed as a power user! As a result of my blunder, nearly all the apps that were being updated were wiped from the system, including some of the ones that did matter to me. Here’s the interesting part though: when I went back to the Windows Store to try and install them again, I couldn’t. This is what I saw:

So the Store thinks the app is already installed. Why, then, does this happen when I search for it?

Something in Windows clearly thinks I have the app installed. In the past I would have known to check certain folders like Program Files or delve into the registry to see if some sort of metadata was lying around, but Windows 8 changes things up a bit: these Metro apps are completely self-contained and sit in the WindowsApps folder, which is locked down and doesn’t even let me read it by default. So how can I remove these hidden settings and get Windows to play nice again?

Fortunately, after some Googling I found the answer on this forum, which I will explain in detail below. In this example I’m going to re-install the Bing app despite the Store telling me that I already have it.

First you need to open up an elevated PowerShell console. Simply:

  1. Press the Windows key to go to the Start Screen.
  2. Start typing “PowerShell”.
  3. Right-click on “Windows PowerShell” and click on “Run as administrator” in the menu that appears at the bottom of the screen.

Now you can run this cmdlet to see what Metro apps Windows considers to be installed:

Get-AppxPackage -allusers
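# Tip: you can narrow the list down by name, e.g. Get-AppxPackage -AllUsers *bing*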


This should give you a list like in the image below:

Now that you’ve got the details of the app you can run another cmdlet to remove it. In my case I did this:


Remove-AppxPackage Microsoft.Bing_1.5.1.251_x64__8wekyb3d8bbwe
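# (or pipe it straight through: Get-AppxPackage *bing* | Remove-AppxPackage)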


You’ll notice that you have to use the PackageFullName value that is provided in order for the cmdlet to work.


Once that was done I went back to the Windows Store and checked that it worked:

In my case I simply repeated these steps until I had cleared up my mess and managed to get everything I wanted back.


I think it’s actually quite good that there is a scriptable way to uninstall apps like this. Regardless, this trick got me out of a hole, so I’m sure someone else might benefit from re-learning Windows tricks the way I am.

Thursday 8 November 2012

The Urge to Rewrite Code

I recently read a fantastic article on Ars Technica about the inner workings of the new WinRT technology powering Windows 8. It’s a lengthy article but it explains how we got all the way from Win16 to Win32 to OLE to COM to .NET and eventually to WinRT and is well worth a read if you’re interested in Windows development and its history.

What got me thinking, though, is that it gave me a snapshot of Microsoft’s process of keeping Windows and its technology up to date while also supporting a vast legacy of applications, some of which have been running for the past 20 – 30 years. What you realise is that, to support that legacy and keep pushing forwards, Microsoft basically built everything on top of what came before. If you were to start from the latest WinRT APIs and strip away all the strata you would eventually hit the ancient Win32 API and kernel, the very same system that has powered Windows since Windows NT. The same can be said for .NET and COM; it doesn’t matter how new or fancy the latest technology trend is, eventually it all becomes a nicer wrapper over what came previously.

This is interesting to me because I’ll bet every single developer out there has had one thought cross their mind at some point in their career: “I need to rewrite this”. Maybe you have a codebase written in another era, or you’ve inherited code that looks like a monkey tap-danced on a keyboard; sometimes we all fall prey to thinking that we need to invest time in rewriting huge chunks of code (or entire systems) because, obviously, we can do a better job and the end result will be “better”.

This is dangerous!

I can speak from experience, and cite any number of references, when I say that taking on a rewrite of a codebase, especially an enterprise or commercial one, is just a bad idea. What you are effectively suggesting is to take a (mostly) working system, spend 6 – 12 months (maybe more) ditching it and starting all over again, and end up with the exact same system you started with plus countless additional bugs introduced through human error and/or lack of domain knowledge. On top of that, all the time you spent rewriting code is time you couldn’t spend on current development work, meaning your competitors have charged ahead with brand new features that you may never be able to catch up with.

All of this so that your code looks better, a matter that no-one else but you cares about.

Now, I am being a bit extreme here. Of course there are times when you have to rewrite something in order to move forward; maybe your code is so stuck in the dark ages that adding new features becomes increasingly time-consuming or complex, maybe even impossible. Yet I’ve learned over the years that you simply cannot ditch what you already have. Your customers don’t give two hoots how you managed to kludge together their feature; the fact remains that it was done and it works, so breaking it now is not an option.

So what can be done to refactor code effectively? Below are some ideas I’ve thought of, some of which I even try to implement myself.

Do Nothing

By far the simplest strategy, as it requires no work at all! Simply learn to live with your codebase, quality be damned. If you can overcome your initial feelings of revulsion at the spaghetti code you see daily and just accept it for what it is, you might find it easier to live with than you expect.

Of course doing nothing also means making no improvements so it’s quite a trade-off, but as the old saying goes “if it ain’t broke, don’t fix it”.

The Big Bang Strategy

Image courtesy of http://memegenerator.net/

The polar opposite of doing nothing is doing everything in one go, but as I’ve already said this is very extreme and hardly ever needed; there are better ways of improving your codebase without greatly affecting anything else.

The Microsoft/Onion Strategy

What I’ve seen Microsoft tend to do is build layers on top of their existing technologies, so that each new layer has a better API than the one below and handles the fiddly, lower-level details so you don’t have to.

For example, consider when .NET first came into existence and introduced Windows Forms. This was meant to replicate the drag-and-drop style of development that Visual Basic programmers had long been used to. But do you think that all the framework classes designed to handle windows and controls were written from scratch for a brand new, untested technology? No – Windows Forms was simply an easier-to-use wrapper over interop’ed Win32 code, because that code already worked; why re-invent the wheel?
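To make the idea concrete, here’s a rough sketch of what such a layer can look like. This is purely illustrative – it is not how Windows Forms is actually implemented, and the wrapper class and its names are made up; only the MessageBoxW import is the genuine Win32 call.

using System;
using System.Runtime.InteropServices;

// A hypothetical "nicer" layer: callers get a simple method and never see the raw API.
public static class FriendlyDialogs
{
    // The fiddly, lower-level detail: a P/Invoke declaration for the old Win32 function.
    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    private static extern int MessageBoxW(IntPtr hWnd, string text, string caption, uint type);

    private const uint MB_OK = 0x0;

    // The new layer adds nothing Win32 couldn't already do; it just hides the plumbing.
    public static void ShowMessage(string text, string caption)
    {
        MessageBoxW(IntPtr.Zero, text, caption, MB_OK);
    }
}

// Usage: FriendlyDialogs.ShowMessage("Hello from the layer above Win32", "Onion strategy");

The wrapper is trivially thin here, but that’s rather the point – each layer only has to be nicer than the one below it, not a rewrite of it.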

Of course, once you’ve introduced these layers and made sure they are working effectively, you could start to clean up the lower layers, or possibly even remove and replace them so that you don’t need so much API coverage – assuming, of course, you can remove all the dependencies on the low-level code.

The Side-by-Side Strategy

This is a refactoring strategy I tend to use myself. Let’s say you have a feature that, for whatever reason, you are going to rewrite. What I do is never touch the old code; instead I create a separate layer of classes alongside the existing code that replicates the same functionality but is written differently, usually separated from the old classes by appropriate namespaces.

Now I can work on the new code whilst the old code can still be deployed if necessary, without affecting my other team members’ builds. Eventually the rewritten code is fleshed out enough for call sites to start using it, which phases out the old code. Once all references to the old code have been removed, you can safely delete it from your codebase. This might take quite a while to achieve fully, but it is certainly a lot safer than starting from a blank canvas with nothing to show for a long time.
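Here’s a minimal sketch of the layout (the names are made up for illustration): the old and new implementations expose the same functionality but live in separate namespaces, so call sites can be switched over one at a time.

namespace MyApp.Reporting
{
    // The existing code: untouched, still deployable while the rewrite is in progress.
    public class ReportBuilder
    {
        public string Build(int customerId)
        {
            // ...years of accumulated kludges live in here...
            return "legacy report for customer " + customerId;
        }
    }
}

namespace MyApp.Reporting.V2
{
    // The new code: developed alongside the old, adopted one call site at a time.
    public class ReportBuilder
    {
        public string Build(int customerId)
        {
            return string.Format("rewritten report for customer {0}", customerId);
        }
    }
}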

I also use this strategy together with the ObsoleteAttribute to make it clear that code is old and should no longer be used; it also helps me find all the references to the old code by giving me compiler warnings that I can work my way through.
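For example (continuing the made-up names from the sketch above), marking the old class like this turns every remaining reference into a compiler warning that I can work through at my leisure:

namespace MyApp.Reporting
{
    [System.Obsolete("Use MyApp.Reporting.V2.ReportBuilder instead.")]
    public class ReportBuilder
    {
        // ...old implementation left unchanged...
    }
}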

The Inheritance Strategy

As an alternative to having old and new code side by side, you could also implement it top-to-bottom within an inheritance hierarchy. The new code would be contained in a base class, while the old code would derive from the new and still keep its existing API. This means that legacy code could, in theory, be passed into functions which require the new class and it would still work.
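A rough sketch of that idea, again with hypothetical names – the rewritten logic sits in the base class and the legacy class becomes a thin shim that keeps its old API alive:

// The new code: what freshly written callers are expected to use.
public class PriceCalculator
{
    public virtual decimal CalculateTotal(decimal net, decimal taxRate)
    {
        return net * (1 + taxRate);
    }
}

// The old code: derives from the new class, so it still *is* a PriceCalculator,
// but keeps the legacy method that existing callers depend on.
public class LegacyPriceCalculator : PriceCalculator
{
    public decimal CalcTotalInclVat(decimal net)
    {
        return CalculateTotal(net, 0.20m); // hard-coded VAT rate, just as the old code always did
    }
}

// Any function written against the new type will also accept the legacy one:
//   void PrintInvoice(PriceCalculator calc) { ... }
//   PrintInvoice(new LegacyPriceCalculator());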

Conclusion

There are many alternatives to refactoring code in one large chunk – I’m sure there are also many other strategies thought up by people far cleverer and more experienced than me. Essentially, I have learned from my career that the “big bang” approach never works out well, and a more long-term, slower strategy usually gives the best results.