NextgenHeating

Sunday, February 20, 2011

10:10 and Microsoft

My 10:10 article earlier today got me thinking about another topic.  Last year I noticed that Microsoft had signed up for 10:10.  I admit to having had a little intellectual snigger at this as Microsoft must preside over one of the biggest carbon footprints on the planet in the form of its Windows operating system.
  
It's easy to knock Windows and I am certainly not the first person to do it.  I also understand that Windows is one of the biggest, if not the biggest, software projects ever undertaken.  As such, that it works at all is a minor miracle; after all, many such projects have simply collapsed under their own weight.  Neither am I a Linux bigot; I still use Windows for the simple reason that it is the closest thing there is to a standard in the PC space.  No, the problem with Windows is not that Microsoft is incompetent, far from it.  The problem is that Windows is simply too old.


Windows was written when computer hardware was much less capable than it is today.  For example, it had to make use of virtual memory to support all of its features.  Now the SDRAM memory on my PC, at 8Gbyte, is 20x as big as the hard disk of the first computer I ran Windows on, yet I still cannot turn off the virtual memory feature.  As a result it sits there "strumming" the hard disk even when I do nothing.  Worse by far is the automatic update system.  Think of all the servers, the network infrastructure and the sheer number of PCs involved in keeping the OS alive today.  And all, it seems, for nought.  My Windows 7 installation, though less than 6 months old, is fatally flawed and requires (according to my web research) a rebuild, but I don't have the time or indeed the heart to do it.  This is all a waste of energy, both figuratively and literally.


The Intel turn-around over the NetBurst (Pentium 4) architecture is now very well documented.  Pentium 4 CPUs, in their push for more and more MHz, were so wasteful of power that they were pushing the thermal dissipation problem to its limits.  At the time I would only consider AMD CPUs as I valued MIPS/watt over raw MIPS.  I applaud Intel's courage in making the turn-around to achieve a four-fold reduction in power consumption while achieving a similar increase in CPU power.  Indeed their 2011 "Sandy Bridge" architecture, with its dedicated hardware acceleration, is yet another step in the right direction.


So could Intel's success be replicated by Microsoft?  Could they make a Windows-compatible operating system from the ground up?  In theory, yes, though many of us still remember Longhorn!  However, all incumbents have a disadvantage and Microsoft shows no signs of having the will to fix Windows.  Only now, for instance, are they talking of porting Windows to run on the ARM.  I suspect this is merely a reaction to the disruptive force of the iPad and its derivatives.

All this reminds me of a recent PC Pro article in which Hermann Hauser "believes that Intel's days are numbered" and that ARM will inevitably kill Intel.  I have a huge respect for Dr Hauser.  I have had the privilege to work with him three times, the first of which, coincidentally, was at Acorn at the time the ARM CPU was designed.  One of Hermann's favourite stories was of a visit from Bill Gates, who was then trying to peddle MS-DOS.  Hermann sent him away saying he had superior technology which, of course, he did.  However, you have to ask yourself why Bill Gates is now a billionaire when Hermann is only a millionaire.  Indeed Acorn had local area networking in 1984, before Ethernet reached the mass market, yet it failed to capitalise on its technical superiority.  Hermann, then, is a clever man but not a perfect industry barometer.

As per my previous article today, I do agree with Dr Hauser that we are entering a new wave of devices with "good enough" technology.  That these devices are battery powered is great news for our global carbon footprint, as any OS that runs on them must have power consumption high on its priority list.  As a new medium there is no need to be compatible with legacy programs, so there is no need for Microsoft's OS.  To my mind that makes porting Windows to tablets both futile and irrelevant.  I think the jury is out as to whether ARM will beat Intel.  It is, however, becoming increasingly clear that one way or another Microsoft's carbon footprint is about to get a lot smaller.

The 10:10 challenge?

10:10 is a movement started, it appears, by the team that made "The Age of Stupid".  It seemed like a great way of getting like-minded people together to create momentum for change.  I joined immediately.  One year on, I admit to being a little disillusioned by their lacklustre success, driven mostly, it seems, by their inability to make concrete recommendations as to how their members might lower their carbon footprints.  10% would seem to be a relatively easy goal but it does require more than just changing perfectly good light bulbs for low-energy ones.

I was recently accused by one of the VCs I'm talking to of proposing what might be a "boil the ocean" story in respect of NextGen heating.  He clarified this by saying "it needs widespread shifts in people's behaviour and major capital outlays to see real adoption".  I hope I've shown through this blog that by simple changes – moving closer to where I work and cycling there, sensible use of traffic control, insulation and so on – I have saved much more than 10% of my carbon footprint and, apart from becoming a little fitter and a little more comfortable, did not make any particular sacrifices.  It did, however, take conscious effort and conscious action (and some failures) to achieve it.

One strong hint at a solution was neatly alluded to by Peter Hinssen, one of the Vlerick Management School professors and an IT futurist, in a keynote called "Digital, the new normal".  There is a slide about 25% of the way through showing a survey into "The necessities of life": 49% of respondents mentioned cell phones, and electronic gadgets, most of which did not exist 15 years ago, made up half of the top ten.  Surely if electronic "toys" can achieve this penetration in half a generation then there is hope for green-tech too?  His ideas on "good enough technology" really resonated with me: why people use Skype over land lines, MP3 over DVD audio, and why Blu-ray has not achieved mass-market penetration.

I don't believe we should be content with scratching the surface of green-tech.  But I don't believe we need to boil oceans to get where we need to be either.  We just need to find good enough technologies that we can deploy and enjoy.

Monday, February 14, 2011

Latency Versus Bandwidth – what Jeremy Clarkson and David MacKay need to understand

I was given a Jeremy Clarkson book, “Driven to Distraction”, for Christmas.  In the opening article he casts doubt on the wisdom of the variable speed limits around the M25, calling them “a new state control system to quash individualism on the motorway”.  Much as I hate to criticize a fellow Doncastrian, I can only conclude that he doesn’t understand the maths.

Back in 1987 I worked for a consultancy.  One of my co-workers was a software engineer engaged in writing fluid-modelling programs.  He told me that, up to a certain point, the flow in a water pipe is laminar and the water travels with very little resistance.  Above a critical velocity the flow becomes turbulent and the water molecules bump into each other, such that the flow is no longer a linear function of the pressure exerted.  My co-worker also told me that the flow of traffic on a motorway could be modelled in a very similar way to water in a pipe.  So much for individualism!

Let us apply the latency-versus-bandwidth analysis I used in the last blog to this new mathematical problem: how to get around the M25 as quickly as possible.  The latency is what Mr Clarkson perceives; it’s the journey time.  To measure the bandwidth of a motorway you draw a virtual line across the road and count how many cars per second, per minute or per hour cross that line.  For instance, if we all obeyed the 2-second rule there would be 3/2 cars per second (3 lanes, one car every 2 seconds per lane), which is 90 cars per minute, 5,400 cars per hour, or about 130,000 cars a day.  Actually that’s not a lot considering how many cars use the M25 every day, but then they don’t all use the same bit of motorway.  Of course many of us don’t obey the 2-second rule, so the bandwidth is actually higher than that.  The point is that the bandwidth is most definitely finite.  What’s more, in a traffic jam the bandwidth of the motorway decreases dramatically!
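For the curious, that arithmetic fits in a few lines of Python.  This is just a sketch of the figures above, not a traffic model; the 2-second headway and the 3 lanes are the only inputs.

```python
# Motorway bandwidth under the 2-second rule (sketch of the figures above).
LANES = 3
HEADWAY_S = 2.0  # one car every 2 seconds, per lane

cars_per_second = LANES / HEADWAY_S       # 1.5
cars_per_hour = cars_per_second * 3600    # 5,400
cars_per_day = cars_per_hour * 24         # ~130,000

# Note: speed never appears. With the headway measured in time,
# bandwidth is independent of how fast the traffic is moving.
print(f"{cars_per_second} cars/s, {cars_per_hour:.0f} cars/h, {cars_per_day:.0f} cars/day")
```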

So how do variable speed limits help?  Well, it appears that the M25 traffic planners understand the difference between latency and bandwidth, and a bit about fluid dynamics.  From my analysis above the peak bandwidth of the motorway is largely independent of traffic speed, and by reducing the speed of the traffic they are more likely to achieve a non-turbulent flow which keeps the traffic moving.  In other words the traffic planners are optimising bandwidth, not latency.  However, the more vehicles they can get past a given point, the more they reduce the average latency too.  Left to their own devices individual motorists might indeed achieve a lower latency, but because they reduce the bandwidth of the motorway the average latency actually increases.

From a fuel-consumption point of view traffic jams are a worst-case scenario.  It’s simple mathematics that for a journey of X miles consuming Y gallons the average MPG is X/Y.  Of course the fashion is to talk about litres per 100km, which is just the other way around.  However, standing still with the engine running is time at 0 MPG, or infinite l/100km.  In other words, time spent stationary, or even in a gear below top, is fuel wasted.  Coming back to the laminar-versus-turbulent flow analogy, if every vehicle travelled at 56mph a laminar flow would be achieved and the overall fuel consumption would be minimised.
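To put a toy number on that, here is a sketch of how idling drags down the average.  All the figures (the 40 MPG cruise, the 0.15 gallon-per-hour idle burn) are my own assumptions for illustration, not measurements.

```python
# How time at 0 mph drags down average MPG (illustrative figures only).
MILES = 30
CRUISE_MPG = 40.0          # assumed steady-state economy
IDLE_GAL_PER_HOUR = 0.15   # assumed idle burn rate

moving_gallons = MILES / CRUISE_MPG
for idle_minutes in (0, 10, 30):
    total_gallons = moving_gallons + IDLE_GAL_PER_HOUR * idle_minutes / 60
    print(f"{idle_minutes:>2} min stationary: {MILES / total_gallons:.1f} mpg average")
```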

However, I for one would not like to drive everywhere at 56mph.  There is also a carbon cost to my being on the planet, so time wasted is also carbon spent.  The good news is that, provided a laminar flow can be maintained, the actual carbon cost of the higher speed is secondary.  That is, if everyone went exactly 70mph then that would still be lower entropy than some people doing 85mph and others doing 50mph.

This brings me neatly on to the second part of my article – road trains.  One way to increase the bandwidth of a motorway is simply to make more lanes.  This is both very expensive and very environmentally unfriendly.  The second, and probably more fruitful, way is to reduce the time distance between cars.  Imagine if there were a way to make a road train such that each car could travel safely just 30cm behind the next or, to be more conservative, say 5m apart.  For simplicity of the mathematics let’s say they are travelling at 120km/h (75mph), or 2km per minute, 33.3m per second.  With 5m spacing the bandwidth of this motorway would be 20 cars per second, or about 13x the current safe bandwidth.  Actually you don’t have to imagine road trains: they are a practical reality right now.
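The road-train sum, for the sceptical (a sketch which, like the text above, ignores car length):

```python
# Road-train bandwidth at 120 km/h with 5 m spacing (car length ignored).
LANES = 3
SPEED_M_S = 120 / 3.6              # 33.3 m/s
SPACING_M = 5.0

per_lane = SPEED_M_S / SPACING_M   # ~6.7 cars/s per lane
total = LANES * per_lane           # ~20 cars/s
print(f"{total:.0f} cars/s, i.e. {total / 1.5:.0f}x the 2-second-rule bandwidth")
```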

In David MacKay’s otherwise excellent book “Sustainable Energy – Without the Hot Air” he mentions several promising low-carbon technologies, but road trains are not among them.  That Dr MacKay is anti-car is quite evident.  However, public transport, as alluded to in my earlier blogs, is not a panacea and I doubt people will be willing to give up on personal transport any time soon.  Road trains represent a pragmatic solution that can reduce car journey times and keep traffic flowing, thus reducing the carbon footprint.

Further, by getting the cars so close together, one of the major contributors to fuel consumption, wind resistance, is dramatically reduced.  Anyone who watches motor racing understands slipstreaming.  Dr MacKay uses this precise argument to explain why trains are more efficient than cars.

There will always be some people who want to go faster than others.  So let me put out a straw man: keep the motorways at 3 lanes, make the inside lane 60mph, the middle lane 70mph and the outside lane 80mph, and make road trains of 10-20 cars followed by a gap.  To change lane the driver indicates his or her intention and the on-board computers negotiate to find a gap and slot the car in.  As the relative speeds between lanes are low this can be done quite easily.  Indeed, with all that bandwidth there will be gaps in the trains; after all, each lane has more bandwidth than the current motorway has.  Now Mr Clarkson and Dr MacKay, at opposite ends of the carbon-political spectrum, are both happy :o).

Wednesday, February 2, 2011

Bandwidth versus Latency, a common misconception

As an electronics engineer I came to understand the difference between bandwidth and latency.  Bandwidth is the number of objects that can travel across a medium per unit time, e.g. megabytes per second from a hard disk (HDD).  Latency is the amount of time each object takes to get to its destination.  Taking again the example of a HDD, this is usually governed by the seek time.  For those that don't know, the seek time is the time the hard disk takes to physically move the head across the platter of the magnetic disk to find the right track, plus the time for the right part of the disk to spin under the head.  For a 7200RPM hard disk the data comes around every 60/7200 seconds, or 8.3ms.

Consider a computer program trying to access a single byte of data from a disk.  Imagine we have a modern computer with a processor running at 3GHz.  This means (as an order of magnitude) it can process 3,000,000,000 instructions per second.  It issues a command to request the byte from the hard disk.  The processor in the hard disk electronics calculates where to move the disk head to, moves it and waits for the data to come under the head.  12ms later the byte is handed over to the processor.  In this time the processor could have executed 36 million instructions but instead did nothing because it was waiting for the hard disk.  If programmers were foolish enough to write programs that randomly accessed many small pieces of data, you can imagine that such a program would run extremely slowly no matter how fast the processor was.
 
Once the data is under the disk read head it streams off the disk extremely quickly, at circa 100Mbytes per second.  Imagine the computer program wanted to read 1Mbyte instead of 1 byte.  We still wait 12ms for the first bit of data, but then we get the whole 1Mbyte in only a further 10ms.  The throughput of the program increases hugely, to about 45Mbytes per second.  We see that the thing we actually care about, the work done per unit time, is a function of both bandwidth and latency.
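The relationship is easy to play with in code.  Here is a minimal sketch using the figures above (a 12ms seek and a 100Mbytes-per-second streaming rate):

```python
# Effective throughput = bytes moved / (seek latency + transfer time).
def throughput_mb_per_s(request_mb, seek_ms=12.0, stream_mb_per_s=100.0):
    transfer_ms = request_mb / stream_mb_per_s * 1000.0
    return request_mb / ((seek_ms + transfer_ms) / 1000.0)

print(throughput_mb_per_s(1e-6))   # a single byte: effectively zero
print(throughput_mb_per_s(1.0))    # 1 Mbyte: ~45 Mbytes/s, as above
print(throughput_mb_per_s(100.0))  # 100 Mbytes: ~99 Mbytes/s, latency amortised
```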

All very interesting, I hear you say, but why publish all this on an eco-musing blog?  Well, there is another parameter that comes into play, and that is the effort expended per amount of work done.  The CPU in the above example did not simply do nothing for those 12ms; it sat there "spinning its wheels", generating heat.  The rest of the computer is also alive: the hard disk spins its platters and moves its heads, the display lights its backlight, and so on.  Even in the second example the computer spends half of its time doing nothing but warm the planet.  This is why over the last year I have been steadily replacing the main hard disks of my computers with solid-state disks (SSDs), aka flash disks.
 
SSDs have no moving parts, which makes their seek times around 0.1ms – two orders of magnitude quicker than traditional HDDs.  As they have no motors or moving parts they also use less power.  The only drawback is that they are expensive.  Or are they?  The machine I am typing on has a 120Gbyte SSD and a 1Tbyte HDD.  The OS, Windows 7, and my programs are on the SSD and all my data is on the HDD.  It turns out that Windows, being nearly 20 years old, was designed when computer memories were small and so is made up of lots of small files – 69,534 of them in my case – which is why it takes forever to boot.  The boot time of this machine before I installed the SSD used to be about 2.5 minutes; now it is about 35 seconds.  Given I use it daily that soon adds up to a lot of time: about 700 minutes per year.  As a consultant I charge of the order of €1 per minute for my time, so that's a €2,000 saving over the SSD's three-year lifespan versus an initial outlay of €200.

It is not only my time during the latency of boot-up that is saved.  Tom's Hardware found that computers with SSDs were on average 15% faster than their HDD counterparts.  As I work with computers about 9 hours a day, that's a potential saving of 1.35 hours daily.  Of course the bottleneck is not always the computer – after all, I can only type at a certain speed – but let's say it saves 2% of my time and 5% of the electricity used by the computer overall.  Over 3 years that equates to ~150kWh of electricity, or €22 worth, plus another €1,200 in terms of my time.  Another way to look at it is that the SSD pays for itself in about 3 months and saves carbon too!
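For anyone who wants to rerun the sums with their own numbers, here is the back-of-envelope calculation.  The inputs are the figures quoted above and, like them, are estimates rather than measurements.

```python
# SSD payback sketch using the figures quoted above (all estimates).
OLD_BOOT_MIN, NEW_BOOT_MIN = 2.5, 35 / 60
RATE_EUR_PER_MIN = 1.0      # my consulting rate
OUTLAY_EUR = 200.0
YEARS = 3

daily_saving_min = OLD_BOOT_MIN - NEW_BOOT_MIN                 # ~1.9 min/day
boot_value = daily_saving_min * 365 * YEARS * RATE_EUR_PER_MIN
print(f"boot-time value over {YEARS} years: ~EUR {boot_value:.0f}")   # ~2,100

months_to_payback = OUTLAY_EUR / (boot_value / (YEARS * 12))
print(f"payback on boot time alone: ~{months_to_payback:.1f} months") # ~3.4
```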
 
Of course, to do a full analysis of the carbon footprint I would have to calculate the embodied carbon in the manufacture of the SSD plus the HDD versus just the HDD.  That might not look so good.  But then what is the carbon cost of my time – heating, lighting, food, clothing and so on?  That is, perhaps, going a little too far for this humble little blog.

Bandwidth and latency are interesting parameters to analyse.  They can also be applied to roads, which allow objects – i.e. you and me – to travel from A to B.  But that is for another blog ...

Friday, January 21, 2011

Paris, A broken city?

I’m sitting on the Thalys train at the Gare du Nord in Paris, soon to set off on the return trip to Brussels.  I took the train as it appeared to be quicker and cheaper than taking the car.  I was wrong on both counts!


The train from Brussels to Paris is indeed very quick, whisking one from capital to capital in 1 hour 20 minutes.  The problem starts once you are deposited on the platform at the Gare du Nord.  I thought I had worked out some onward trains to get me close to my destination.  I was first looking for “the blue line” (well, it was blue on my copy of the plan).  It also seemed to be called the RATP, but I could see signs for neither.  Twenty minutes and three cryptic clues from station staff later, I found the RER (as I now know it) two floors down from the main station.  I boarded the train but it continued to sit at the station for another 5 minutes.  No problem, I thought, only 2 stops on this train.  It then ground into the next station and waited a further 10 minutes before moving on.  I eventually arrived at "Notre Dame" about an hour after disembarking from the Thalys, now too late to catch the last regional train to where I wanted to go.

I emerged from the underground station to find I really was outside Notre Dame cathedral and reluctantly hailed a cab to take me to my hotel at an additional cost of €30 and another 30 minutes. Total journey time from home to hotel was four and a half hours, about an hour more than I could have comfortably done it in the car.

The journey back was worse.  The customer’s office, in the south-west of Paris, is at an intersection of the Périphérique, where traffic starts to build at 4pm and doesn’t stop until 10pm.  This was my original motivation for taking the train.  I tried to call a taxi (actually several taxi firms) to take me to the local railway station, but none of them wanted to come as the traffic was too busy.  Eventually I had to jump on a bus heading in the general direction of central Paris.

At the bus terminus was an underground station, but I'd been advised by the customer to take a taxi as it would be quicker and easier.  My colleague had an earlier train than I did, so we took the customer's advice – big mistake!  I am sure the taxi driver took us for a ride, if you know what I mean: it took 45 minutes and €43 to get to the Gare du Nord.

I am sure millions of people like to live in Paris, after all, why else would they do it? For me there is nothing so appealing about the place that I would be willing to spend my life crawling around in my car, or on dysfunctional train systems or buses. Every Parisian road is full to overflowing with cars travelling at little more than walking pace and belching out carbon dioxide. The general population seems to be resigned to the fact that their commute consumes over 10% of their waking hours. My colleague and I boggled at the waste of life and resources.

Brussels has problems but its transport system is streets ahead (pun intended) of Paris and its appeal just as great. Perhaps another case where small is beautiful?

Saturday, January 15, 2011

Heat 2010 – A lot of hot air?

At the end of last year I attended the HEAT2010 conference in Cambridge.  This small, one-day event, organized by CIR, is one of many they organize yearly on different carbon-technology topics.  As I walked in from the snow outside, several exhibitors were showing off their technologies.
 
What looked like a gas boiler caught my eye.  Indeed it was a gas boiler, but with a difference.  Built by Genlec of Chester, it includes an organic Rankine cycle electricity generator using a scroll compressor in reverse.  They claimed that in addition to the 10kW of heat generated by the condensing gas boiler at 93% efficiency, the unit also generates 1kW of electricity.  The business model is interesting as, unlike the Stirling-engine-based WhisperGen, it is a standard form-factor wall-mounted boiler which would be easy to retrofit.  The price was said to be “about £800 more than a normal gas boiler installation”.  The prices they were using for gas and electricity were 3p and 13p per kWh respectively, so I presume the saving is around 10p per kWh generated.  The unit therefore needs to run for 8,000 hours to pay back the extra cost.  They claim this translates to a payback period of ~4 years, which does seem reasonable: assuming a four-month heating season (120 days a year, so 480 days over the four years), the unit would have to run an average of 16 hours per day.  That seems a little high, but it is at least less than 24 ;o).  To be fair, if the boiler is also used for hot water then it will be on all year round.  However, I would assume that your average eco-warrior would install solar water heating at the same time.  Nevertheless a 4-year payback is a no-brainer.
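The payback arithmetic, using the stand's own figures (and my assumption that every kWh generated displaces grid electricity at full price):

```python
# Micro-CHP payback sketch from the figures quoted on the stand.
GAS_P_PER_KWH = 3
ELEC_P_PER_KWH = 13
EXTRA_COST_GBP = 800
SAVING_P_PER_KWH = ELEC_P_PER_KWH - GAS_P_PER_KWH   # 10p per kWh generated

hours_to_payback = EXTRA_COST_GBP * 100 / SAVING_P_PER_KWH   # 8,000 h at 1 kW
HEATING_DAYS = 120 * 4                                       # four 120-day seasons
print(f"{hours_to_payback:.0f} hours to pay back "
      f"= {hours_to_payback / HEATING_DAYS:.1f} h/day over four winters")
```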

A less convincing technology was being demonstrated by a sister company, VPhase.  This company claimed to "fix the mains input to the house".  When I asked what this meant I was told that "as the voltage into the house fluctuates, energy can be wasted".  The unit regulates the incoming voltage down to 220V, which is claimed to save 10% compared to running appliances at 250V.  It is true that this reduces the power drawn, but if the appliance is a kettle then the extra power at 250V heats the water more quickly and the kettle switches off correspondingly sooner, so no energy is actually saved.  Worse, I'm guessing this unit is not 100% efficient in its conversion, so under these circumstances it's actually worse than nothing.
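A toy calculation makes the kettle point concrete.  The resistance and energy figures below are my own assumptions for illustration; it is the shape of the result that matters.

```python
# A kettle is roughly a fixed resistance, so power scales as V^2/R,
# but the energy needed to boil the water is fixed. Lower voltage
# just means lower power for longer - the kWh are identical.
R_OHMS = 25.0          # assumed element resistance (~2.1 kW at 230 V)
ENERGY_J = 350_000     # assumed energy to boil the water

for volts in (250, 220):
    power_w = volts**2 / R_OHMS
    time_s = ENERGY_J / power_w
    kwh = power_w * time_s / 3.6e6
    print(f"{volts} V: {power_w/1000:.2f} kW for {time_s:.0f} s = {kwh:.3f} kWh")
```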

Right next to this stand was another innovative company promoting low-voltage DC distribution around the home, hijacking the domestic lighting circuit to do it.  On the face of it this is an excellent idea and one I’ve thought of myself.  Now that we are moving over to LED and/or compact fluorescent lighting, the lighting load has gone down by a factor of 5-6.  We could therefore use the same wires but send 50V DC (roughly 1/5 of the voltage) without putting any extra current through the cable.  This 50V DC could easily be supplied by photovoltaic cells with battery backup, and indeed this company was including that in its offering.  However, in my opinion they had gone one stage too far, with a protocol whereby devices could ask for a specific voltage and have it supplied efficiently on request.  IMHO they failed the KISS test.  I also asked them about regulatory compliance – i.e. had they talked to the people who write the building regulations to see what they thought of it?  The answer was that there was no need as the units were officially covered by the EU Low Voltage Directive.  This is probably true, but how does your eco-warrior manage to sell his house after it has been doctored in such a way?
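The same-wires claim is easy to sanity-check.  The wattages below are my own example figures, not the company's:

```python
# Cable loading is set by current, not power. If lighting power drops
# ~5x (incandescent -> LED) and voltage drops ~5x (230 V -> 50 V DC),
# the current through the existing cable stays roughly the same.
OLD_W, OLD_V = 300, 230   # assumed incandescent lighting circuit
NEW_W, NEW_V = 60, 50     # assumed LED equivalent on 50 V DC

print(f"old: {OLD_W/OLD_V:.2f} A, new: {NEW_W/NEW_V:.2f} A")  # ~1.3 A vs ~1.2 A
```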

The next stand displayed some smart meters developed by Cambridge Design Partnership.  CDP's rapid-development pitch was very impressive and the unit featured an interesting algorithm that looks at the profile of electricity usage to deduce (guess?) where the power is going.  This led to some interesting debate as to whether knowing what is being used where would lead to a change in usage patterns.  I tend to agree with the person who said “I know where all the petrol goes in my car, but it does not stop me driving and filling up once a week”.  IMHO knowledge without control or choice is, unfortunately, a recipe for frustration and very little else.

So what about the conference itself?  The day was filled with interesting presentations (see here for the list) by numerous people.  By the end of the day, however, I realized that most of the speakers were preaching to the converted.  They were all trying to sell their ideas to each other; there were no real customers to sell to.  The lack of real progress in the “fight against climate change” was mentioned several times, and each time the frustration in people's voices was apparent.  What was also clear was that if I had not gone to this event then I would not have found out about these great pieces of technology, and therein lies the problem: unless we get the benefits in front of customers we will not save any carbon at all – it will all be just a lot of hot air!  Scientists and engineers on their own won’t save the planet, though marketeers might.

I would normally have finished this blog "on that bombshell" (as Jeremy Clarkson would say), but I just want to explore a plea from one of the speakers – interestingly, the promoter of the micro-CHP boiler mentioned above.  He believed that the best way to get green technologies into the market was to force them in using government legislation.  I have some sympathy with this view.  After all, seat belts were made compulsory and saved countless lives as a result; one generation on, they have been accepted by the vast majority.  The UK building regulations have likewise been progressively pushed in the direction of higher and higher energy efficiency.  But is this really the answer?

Anyone who has seen “The Age of Stupid” has seen how resistant people can be to change.  People see loss before they see gain, which is why there are so many NIMBYs and nay-sayers on the whole global-warming issue.  I believe that forcing people to spend money they don’t want to spend, on benefits they cannot see, will just aggravate them and make them even more resistant.  Better to educate them and present win-win solutions that save both money and the planet.  The test that any technology needs to pass is whether it is a win-win for its customers.  We don’t need technologies looking for a market; we need technologies that fit the market.  Engineers – get out of your silos and learn how to do marketing.