The road to 200Mb/s

Speed test

When we moved to our present house six years ago, the only thing that I missed about our last one was the internet connection. That house had been one of the first to get BT Infinity and I’d quickly got used to 50Mb/s speeds. We only moved 1.7 miles, but suddenly we were cast back into the dark ages with a connection speed of 0.8Mb/s.

This clearly wasn’t going to work, so I signed up for an expensive satellite connection. In theory this offered 20Mb/s speeds, but in practice it was about half that and came with 800ms latency (no good for gaming) and nasty data caps (no good for Netflix).

Things improved three years ago when BT upgraded the cabling, but the distance from the exchange still limited the fastest speed we could get to about 12Mb/s. Upload speed was particularly constraining at 1Mb/s, which is a big problem with a house full of people uploading their photos and backing up their phones to the cloud.

Back in February 2015, I went to Israel to look at the tech/startup community and we were accompanied by friends from the BT Innovation group. Inevitably, the BT folks grouched about their airline problems and we complained about our broadband issues. This set in motion what proved to be a two and a half year joint quest to sort out my internet.

Many possible solutions were explored with BT over the ensuing couple of years, but it was frustratingly difficult to get real options. I think one of the problems was the regulatory restrictions that BT is forced to operate under. The people who could do the actual work (BT Openreach) weren’t allowed to talk to end customers. The people who were couldn’t fix my problem. Somehow we went round and round in circles for months, which stretched into years.

Finally, I got some real options on the table and decided to get a dedicated leased line from BT Local Business. A full fibre to the premises service, which understandably came with an associated high price tag and required me to register as a business. Rather than go for the “Managed Service” option, I decided to go for the slightly less expensive “wires only” option. “How hard can it be?”, I thought.

Well, actually quite hard! BT’s “user guides” seem to assume that you are a qualified telecommunications engineer. Unusually, the internet wasn’t much help. So I thought I’d make a few notes on how to get things working, for anyone who is mad enough to attempt the same thing.

I’d opted for a 200Mb/s service, delivered over a 1000Mb/s fibre bearer. The following section in the BTNet “No Router Option (NRO) User Guide” describes how to connect things at the customer end:

4.1.4. 1000Mbps

1000Mbps services are presented to the customer as 1000Mbits/s Gigabit Ethernet conforming to IEEE802.3z[25]. The customer connection is via a port on the WES1000 NTE.

The EAD1000 NTE customer interface is 1000Base-SX optical presentation via a Multimode dual LC optical connector as specified in the Gigabit Ethernet IEEE802.3z[25] specifications.

The customer must provide the necessary fibres to connect their equipment to the BT NTE.

The optical fibre patch cords to be used must be 850nm wavelength, 62.5/125 or 50/125 micron multimode fibre with LC connectors. The maximum fibre length is 550 metres for 50/125 micron or 220 metres for 62.5/125 micron.

It took me quite a while to decipher this and figure out what I needed to do.

Step 1: Which port do I need to plug into?

“WES1000” stands for “Wholesale Extension Service 1000”, with the 1000 referring to the 1,000Mb/sec speed of the line.

“NTE” stands for “Network Terminating Equipment”, which at least in my case was a box made by ADVA with a model number of FSP150CP FSP-ORNT-11-B. Here is a picture of one, which I have annotated to show the “Multimode dual LC optical” port referred to above.

What they mean by a WES1000 NTE

Step 2: What cable do I need?

You need a duplex multimode fibre cable with LC connectors, like the one shown below.

Duplex multimode fibre cable with LC connectors

Step 3: How do I turn this into something I can connect to my router?

You need a media convertor, which will connect optical fibre to regular copper 1000Base-T. I used a TP-LINK MC200CM Gigabit Multi-Mode Media Convertor.

Media convertor

You will also need a 1000Base-SX module like this one to provide the right socket for the fibre cable to plug into.

1000Base SX Module

Step 4: Connect your router

All that remains is to link the media convertor to the WAN port on your router with a standard ethernet cable, then configure your router with the details BT provides: the static IP address for your router, the gateway address and BT’s DNS servers.
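BT supplies the addressing details on paper rather than via DHCP, so it helps to understand the shape of a typical static assignment before typing it into the router. Here is a sketch using Python’s standard ipaddress module with a made-up example block (203.0.113.40/29 is documentation address space; your real addresses, and which one is the gateway, will come from BT):

```python
import ipaddress

# Hypothetical example block -- substitute the addresses on your
# BT handover document. A /29 gives six usable host addresses.
block = ipaddress.ip_network("203.0.113.40/29")
hosts = list(block.hosts())

print(block.netmask)  # the subnet mask to enter on your router
print(hosts[0])       # typically the gateway (BT's end of the link)
print(hosts[1:])      # the remaining addresses, available for your equipment
```

For this example block the netmask is 255.255.255.248 and the usable addresses run from 203.0.113.41 to 203.0.113.46.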

And that is all there is to it. 200Mb/s up and down and a happy household.

Travel Statistics

Travel stats. We all love them. But where’s the best place to go to find out where you’ve been?

IAG, my employer and owner of Aer Lingus, British Airways, Iberia, Vueling and the newly launched LEVEL, has been running a corporate accelerator programme for startups, called “Hangar 51”. I’ve been sponsoring the “Data Driven Decisions” category, and one of the startups I’ve been working with is esplorio, an automated travel journal service.

One of the things IAG and esplorio have been working on together is to offer our customers the opportunity to link their esplorio and BA Executive Club accounts, to give them a combined view of their data. Whilst we are sorting out the technical details, I thought I’d do a one person “proof of concept” by manually consolidating my own data.

Esplorio is a relatively young service, but it enables you to link to your Facebook, Twitter and Foursquare accounts. This means that I have data in Esplorio going back to 2011, which was when I joined Foursquare.

My BA data starts in 2004. This is when BA thinks I joined the Executive Club. I’m pretty sure that I joined before this, but maybe BA knows me better than I do.

According to esplorio, I’ve been to 30 countries and have travelled 454,648 miles. BA thinks I’ve been to 24 countries and have flown 351,818 miles. Whilst the two data sources agree on 21 countries, BA missed 9 that esplorio had and esplorio missed 3 that BA had, so the true total is 33. Which I think demonstrates the power of combining data, and also makes me think I’ve spent too much time on a plane.
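The country arithmetic is just set algebra, and is easy to sanity-check. A small sketch (the country names are placeholders, since I’m not reproducing the real lists here):

```python
# Placeholder country lists: 21 agreed by both sources,
# 9 seen only by esplorio, 3 seen only by BA.
shared = {f"shared-{i}" for i in range(21)}
esplorio_only = {f"esplorio-{i}" for i in range(9)}
ba_only = {f"ba-{i}" for i in range(3)}

esplorio = shared | esplorio_only
ba = shared | ba_only

print(len(esplorio))       # 30 countries in esplorio's data
print(len(ba))             # 24 countries in BA's data
print(len(esplorio & ba))  # 21 in both
print(len(esplorio | ba))  # 33 -- the combined total
```

The union is where the value lies: neither source alone gets you to 33.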

Frustrated

Where is the missing external keyboard incorporating the new Touch Bar

Where is the external Touch Bar keyboard?

OK, I’ll admit I’m what some would call an “Apple fanboy” and I am inclined to give them the benefit of the doubt when it comes to some of their more controversial decisions. I’m certainly prepared to accept higher prices and being required to regularly upgrade peripherals to the latest standards if that gets me a better product. For example, dropping Firewire for Thunderbolt or 30 pin for Lightning connectors.

But I have to say that recently Apple has been annoying the hell out of me. Not yet enough to get me to switch away to another operating system, but certainly enough to make me doubt the competence of the company’s management.

Back in 2010, Apple released the 27 inch Thunderbolt display. I bought one in 2012 for £750 and loved it. If they had upgraded it every 2-3 years, I would have happily given them another £1,500+ by now. They didn’t and have now killed it, instead promoting LG and Dell displays. If they believe that these displays are good enough, why the hell wouldn’t they rebadge one of them and earn a markup and keep that Apple logo front and centre for their customers? If they don’t think the quality is good enough to put their brand on, why are they willing to promote them as the preferred solution?

They neglected the Mac Pro for years and then Phil Schiller finally introduced a new one in 2013 with the line “can’t innovate anymore, my ass”. They have completely failed to update it ever since. Phil’s ass has a lot to answer for at this point.

The Touch Bar introduced yesterday on the new Macbook Pros is a decent new innovation. So why didn’t they launch a new desktop keyboard incorporating it? It would have been straightforward from a technical point of view and would have instantly added value to their existing lineup of desktop machines. A missed opportunity that suggests a culture of neglect for the Mac.

I’m also getting fed up with the dropping of ports. I’m OK with dropping old ports for which there is a better, more modern equivalent. I’m also somewhat sympathetic to the pruning of ports where weight and space are at a premium. But the new Macbook Pros couldn’t squeeze in an SD slot? Or an HDMI port? Neither of these has a more modern equivalent. And whilst USB-C is the successor to standard USB, how problematic would it have been to include at least one for a generation of Macbook Pros whilst in transition? It’s not even as if Apple has come up with any great external options. A drawer full of dongles is a really bad and inelegant solution.

Overall, I can only conclude that the current Apple management is either out of touch or getting the short term / financial considerations out of balance with the long term / customer value ones. Phil Schiller’s ass deserves a kicking and Tim Cook needs to listen to the views of his increasingly disillusioned loyal users.

A Swift update

Once again, I failed to live up to my good intentions to update this blog more in 2015. So as a new year starts, I thought I would do a few “catch up” posts covering events from last year.

The first one of these is a catch up on the coding front. I’d been playing with Swift right from the beginning, but early attempts at rewriting some parts of my Mac Finances application in Swift rapidly convinced me that Swift just wasn’t ready and I reverted to Objective C. However, I continued to play with Swift on the side as I do really like the language.

The advent of Swift 2.0 prompted me to ‘take the plunge’. I decided to completely rewrite the application in Swift – not because this is a good idea, but because I felt this would be the best way to really learn the language. It took me about 5 weeks to rewrite 12,700 lines of Objective C, working in my not very large amount of spare time, which is either a long time or surprisingly quick, depending on your perspective. It probably took me another couple of weeks to squash the final bugs that I had inadvertently introduced during the process. Overall, I’d say the experience was quite positive.

One of the things that I was intrigued to discover was whether Swift would produce more compact code, so I kept track of things as I converted them over. The outcome was a 13.6% reduction, with a resulting Swift code base of 11,000 lines. I did make a few functional changes and refactored some of my earliest code along the way, so I can’t guarantee that this was all due to the language, but I tracked the reduction as I went and it seemed pretty consistent. The reduction was only a little larger than the number of lines of code in the Objective C header files, so I think that the elimination of these is probably the main saving. But what struck me most was that the final Swift code was much cleaner and more elegant than the Objective C version, and the “proof of the pudding” is that I would be really unhappy to have to switch back now.

Tech hunting in Israel

The Old City, Jerusalem

After three weeks catching up with ‘the day job’, I’m finally getting round to writing about a recent tech scouting trip to Israel. Four days of meeting with entrepreneurs, investors, startup companies and other participants in the Israeli technology and innovation ecosystem.

I had high expectations for the trip. Israel has acquired a reputation as the second most important centre for technology and innovation (the undisputed global capital of course remains “The Valley”). Those expectations were exceeded and I came away deeply impressed by what has been achieved there. But the thing that impressed me most was the sheer entrepreneurialism and energy of the people that I met: the willingness to take risks in pursuit of big rewards, and to treat “failure” as an opportunity to do better next time, drawing on the lessons learnt.

The UK, I think, has much to learn from Israel about how to foster and support innovation and create the jobs and companies of the future. There are lessons for governments, for companies and for individuals, and I’d definitely recommend a trip.

Oh… and Jerusalem is pretty amazing too!

Confessions of a closet coder

My first computer

I promised myself to post more regularly on this blog in 2014 and after a good start, I’ve let things slide a bit recently. Laziness apart, one of the main reasons is that I’ve been busy on other projects, most notably learning to write Mac and iOS applications. “Why on earth have you been doing that?”, I hear you ask. Well if you want to know the answer, read on. However, I should warn that things are going to get a little geeky, so look away now if you are not into that kind of thing. You have been warned!

I’ve always loved programming. My first experiences, as for many others of my generation, were with BASIC. In my case, it involved typing programmes into a teletype machine at school, dialling into the local council mainframe. Back in those days, “saving” your programme meant punching holes in a roll of paper tape.

Ease of use regressed further when I got my first personal computer, a Sinclair MK14. Not only did I have to assemble the machine myself, it had to be programmed in machine code, and I don’t mean assembler. I mean inputting the hexadecimal numbers that represent each instruction. Initially, there was no way to save your programmes at all – you had to type them in again after each reboot, no small task using a 20 key membrane keyboard. The only plus point was that with only 256 bytes of RAM to play with, there was a maximum of 512 hexadecimal digits to type in.

Over the years, I’ve always tried to keep my hand in and I’ve used PASCAL, Visual Basic, Javascript, PHP, Applescript, BASH and Python. I dabbled at times with C, the low level nature of the language appealing to the machine code hacker in me. But until recently, I’ve never attempted to produce any real world programmes using the language as it was all just a bit too much hard work.

About 6 months ago, I decided to learn Objective C – an object oriented version of C that is used by Apple to develop Mac and iOS applications. I was driven by a mixture of intellectual curiosity and a desire to be able to write real Mac and iOS applications for my own use. Specifically, to replace the 45 sheet Excel spreadsheet that I use to manage my personal finances with something easier to use and maintain (I’ve tried but never liked any of the personal finance apps available in the market).

Objective C and Apple’s Cocoa frameworks have a steep learning curve, even for someone with experience of other languages. The learning resources that I found invaluable were the books from Big Nerd Ranch, video tutorials by Simon Allardice on Lynda.com and the Q&As on Stack Overflow. However, despite this learning curve, after six months my personal finances spreadsheet has been retired. The new Mac app has advanced to the point where it does the job much better. I’m not saying that you’ll be able to buy it on the app store any time soon, since it’s definitely an app targeted at a one person niche market segment. But overall I find it quite amazing that the development tools have now advanced to a point where a hobbyist like me, coding in my spare time, can learn how to build a functional Mac application in such a relatively short space of time.

The world of technology being what it is, just as soon as I’d “finished” learning Objective C (in as much as you can ever finish learning anything), Apple duly announced at this month’s WWDC developer conference that they were replacing Objective C with a new language, Swift. Having spent a couple of weeks playing with it, including experimenting with rewriting some parts of my finances application in Swift, I can say that there is much to like about the new language. To me, it brings many of the ease of use attractions of the Python language, whilst retaining the speed advantages of a low level language built on C. It solves many of the “WTF” moments I had when learning Objective C. To give one example, to add two decimal numbers A & B together in Objective C you have to use the syntax [A decimalNumberByAdding: B]. Swift has operator overloading so even for non primitive data types like NSDecimalNumber, you can use the syntax A+B. They’ve also brought across one of the great things about Python – an interactive mode where you can type commands and get an immediate response without needing to compile and run your code first, a real benefit when learning a language.
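The Python half of that comparison is easy to show: the standard library’s Decimal type overloads + in just the way Swift now allows for NSDecimalNumber:

```python
from decimal import Decimal

a = Decimal("1.10")
b = Decimal("2.20")

# In Objective C this would be [a decimalNumberByAdding: b];
# with operator overloading it is simply a + b.
print(a + b)  # 3.30
```

And because Python has an interactive mode, you can try exactly this at the prompt without compiling anything, which is the other feature Swift borrowed to good effect.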

So all in all I think Swift will make it even easier to get into Mac and iOS application development. However, just at the moment it presents some additional challenges. Swift is still in beta and you still need to learn Objective C if you want to write apps today. It will take some time for the books, training videos and online Q&As that I found so useful to catch up. It has also given me a new language to learn. It’s a good job I like learning new things!

Olympus OM-D E-M5 – First Impressions

OM-D E-M5 Test

Olympus 45mm f/1.8 lens at 1/3200s, ISO 200.

I’ve been hearing a lot about the advantages of the micro four thirds system, especially for travel where the size and weight advantages of “going mirrorless” are most clear-cut. Having recently booked a holiday to the Maldives and realised that the sea-plane transfer would come with a 6kg cabin baggage restriction, I started to look into the options with more purpose.

After an enjoyable few days researching gear (always a guilty pleasure), the Olympus OM-D E-M5 stood out as the best match for what I was looking for: a camera system that delivered a big weight reduction whilst making the fewest compromises on image quality compared to my Canon 5D MkII. I’ll admit that having been a proud owner of an Olympus OM-1 as a teenager many years ago, the retro styling was a clincher.

After a few days testing out the camera, my overall impression is very favourable. On the weight front it certainly delivers: the body and lenses are generally around half the weight of the Canon full frame equivalents. The benefits at the telephoto end are particularly striking, given the 2x crop factor of the smaller sensor. The Olympus 70-300mm has a full frame equivalent focal length of 140-600mm, giving it slightly more reach than the 560mm I get with my Canon 400mm DO lens plus 1.4x extender. However, it weighs in at a mere 620g, compared to the Canon combination’s 2.2kg – 3.5 times heavier.

OM-D vs 5DMkII

Comparing Canon 5D MkII with Olympus OM-D E-M5

It was pretty straightforward to find lenses which covered the range of focal lengths I was looking for, in a mixture of zooms and prime lenses. Generally, the zooms are more versatile but to get the sharpest image quality, best low light performance and shallowest depth of field, you need the fast prime lenses. These have the added benefit of being very small and light. Here are the lenses I selected after a few hours studying the forums (I know it should be fora, but nobody would understand me if I said that).

Olympus lenses

Front to back: 45mm f1.8, 9-18mm f4.0, 12-50mm f3.5-6.3, 40-150mm f4.0-5.6, 75-300mm f4.8-6.7

The only lens that I bought which is not pictured above is the one I used to take the picture, the Panasonic 20mm f1.7 pancake lens. As you can see, it can give you great shallow depth of field shots, although not quite matching what can be achieved with my Canon 50mm f1.4 on a full frame camera.

Overall, the image quality of the body and lens system is excellent. I haven’t really tested low light performance properly but it looks to be not far off what can be achieved with the 5D. In good light, the gap reduces further so on the primary objective of reducing weight with minimal compromise to image quality, the OM-D really delivers everything I was hoping for.

In addition to weight, there are several other ways in which the OM-D improves over the 5D. The fast sequential shooting mode is very impressive at 9 frames per second, compared to only 4 fps on the Canon. Likewise, the exposure bracketing mode seems nearly instantaneous. I love the ability to view a live histogram whilst framing the shot, making exposure issues much easier to avoid “in camera”. Horizontal and vertical level meters are another useful addition. Finally, the art and picture modes are a great source of creative inspiration, enabling you for example to shoot black and white photos and see what you are going to get in the viewfinder. If shooting RAW, you get a jpg with the effect added and an unaltered RAW file, giving you the best of both worlds.

Grainy Film art filter

Testing out the ‘Grainy Film’ art filter

Video is another key area for me. After discovering the beautiful quality of footage that you can get from a DSLR fitted with quality glass, it is really hard to be satisfied with video shot with a consumer camcorder. However, the lack of autofocus in movie mode on the 5D makes shooting video a real challenge. From what I’ve seen so far, the E-M5 can shoot video which comes close to matching that from the 5D, with the added advantage of autofocus. I found the autofocus to be generally good, but it did have a tendency to hunt a bit when presented with a moving subject, especially when shooting with a wide aperture. But overall, great quality and much easier to use. The main shortfalls on the video front are a lack of frame rate options (you are stuck with 29.97 fps if you want 1080p) and the lack of an external mic input. I’ve ordered the SEMA-1 microphone adapter which will rectify this, but at the cost of occupying the hot-shoe, which means you cannot mount an external microphone to the top of the camera.

Other gripes? Remarkably few. The smaller size does present some disadvantages from a handling perspective; I’m considering whether to invest in the battery grip as a result. The electronic viewfinder, whilst good, still cannot match the optical viewfinder on a DSLR. The build quality is a bit lower in a few areas, but of course the price difference is also big.

In summary, I’m not going to be selling off my Canon gear just yet, but for the many occasions where size and weight are important considerations, the OM-D E-M5 will be my camera of choice.

The challenges of digital engagement for business leaders

Text100 Report

I woke up this morning to a tweet from Text100, a global PR agency focusing on technology and digital lifestyle. Apparently, I am one of the most ‘digitally engaged’ FTSE100 directors and have been featured in their recent report “How top FTSE 100 executives are engaged in social and digital media”.

Having read the report (well, you would, wouldn’t you?), the most surprising thing is the inactivity of almost all the other directors. Why is this?

The authors comment that “any busy professional who’s tried to regularly maintain a personal blog will understand how hard it can be to find the time to write regular posts”, and I can certainly vouch that this is a factor. They also point out that “Due to the environment of heavy regulation and intense and public scrutiny that the financial services industry operates in, few executives from this sector are willing to participate in social media since the associated level of risk is deemed too high”. I agree that this is a significant factor for business leaders, and not just in the financial services industry. The downside risk of being found to have “said the wrong thing” can seem to outweigh any possible upside of publicly participating in the online world in a personal capacity (better to leave it to the professionals in the communications department, surely?).

But I agree with the authors that this risk averse approach by executives is very short sighted, placing their own careers and the future of their companies at much greater long term risk. My approach has been driven by a mixture of fascination with the technology and a strong belief that the only way I can remain current and truly understand how the new social media world works is by fully participating in it myself.

And of course the reference to the fact that my blog is “infrequently updated” was a bit of a spur to compose this post!

DRM rant

Simon Pegg in Mission Impossible Ghost Protocol

I have a complaint. The media companies and broadband internet companies have conspired to make me break the law and contribute to global warming. They are also wasting my time. And all for no good reason.

I can buy a high quality encoding of the latest films on blu-ray. I like it that they usually include a DVD version and a “digital copy”. When the media companies first started doing this, I was thrilled. Finally, I thought, the dinosaurs have caught up with the fact that increasingly people want to watch their media “on the go”, on laptops and iPads. Or set up a media server at home, streaming content around the house over WIFI. But to my dismay, I then discovered how low the quality of the digital copy is. Let me use the example of a recent acquisition, Mission Impossible: Ghost Protocol (an excellent film by the way).

Format         Resolution
Blu-ray        1920×817
DVD            1024×436
Digital copy   853×354

These are the output resolutions on the screen, excluding the black bars. This flatters both the DVD and the digital copy through the use of non-square pixels: the actual numbers of horizontal pixels stored in the DVD and digital copy files are 720 and 640 respectively.
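The stretch can be checked with a little arithmetic. Widescreen PAL DVD uses a pixel aspect ratio of 64:45, and the digital copy numbers imply a 4:3 stretch (that last ratio is my inference from the figures, not a published spec):

```python
from fractions import Fraction

def display_width(stored_width, pixel_aspect_ratio):
    """Horizontal resolution as shown on screen, after the player
    stretches the stored (non-square) pixels."""
    return int(stored_width * pixel_aspect_ratio)

# Widescreen PAL DVD: 720 stored pixels with a 64:45 pixel aspect ratio.
print(display_width(720, Fraction(64, 45)))  # 1024

# Digital copy: 640 stored pixels, apparently stretched 4:3.
print(display_width(640, Fraction(4, 3)))    # 853
```

Either way you count it, the stored picture is well below what a modern screen can show.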

The digital copy matches the quality offered as a purchased download from iTunes, which is worse than DVD. Why? With the latest iPad and Macbook Pro having screen resolutions higher than even the blu-ray file, it certainly isn’t because the quality is “good enough”.

Presumably, one reason is to limit the download sizes. But why make you download the “digital copy”, rather than include it on the disk? Forcing the customer to download the file unnecessarily uses up often scarce bandwidth.

The answer given to these questions by the media companies of course is “anti piracy” and “digital rights management” or DRM. But does all this inconvenience being inflicted on the honest consumer actually prevent piracy? Of course not. It is straightforward for the knowledgeable (which certainly includes the professional pirates) to rip the high resolution blu-ray files from the disk. There’s a great guide here for example. It is even easier to rip the DVD.

In my own case, the distance of my house to the telephone exchange forces me to use satellite broadband, which comes with an impossibly constraining bandwidth cap, even with the most expensive plan. So even if I was prepared to put up with the lower quality, downloading movies from iTunes is simply not an option for me. To watch the content I have purchased at good quality on my high resolution iPad requires me to waste time and energy ripping the blu-ray files and re-encoding.

I suppose that the approach of the movie companies does reduce “casual piracy”. Making it hard for most people to make a high quality copy for their friends may result in a few more blu-ray disks being sold. But somehow, I can’t see how taking such a customer unfriendly and backward looking approach can be in the long run interests of the movie industry.

It certainly annoys the hell out of me!

Instagram

Hyde Park

I’ve been having fun taking and sharing photos with Instagram. You can only post photos using the dedicated iPhone app. The low resolution and the creative filters it prompts you to apply really encourage you to concentrate on composition, creativity and interesting subjects. Sharing the results on Twitter or Facebook is a breeze. Highly recommended.