Wednesday, 8 February 2012

Slow compile times in C++

I just thought I would put up a short post on this, to do the world a favour. It's probably not relevant to most people, but C++ programmers might find this post on Google.

Is your C++ project taking ages to compile? Do you have a brand spanking new machine, but it takes tens of minutes to compile your projects?

This problem is a pet hate of mine, and something that should become apparent pretty quickly when working on big projects or using third party SDKs. However, I'm constantly surprised at the number of people who are oblivious to the problem, or perhaps can't be bothered to change their working practices to take account of it.

The reason for slow compilation is, in 99% of cases, not the size of your project, or the number of files (maybe there's some Freudian reason why some guys like to impress with the size of their project by its slow compilation). No. Most halfway modern computers can zoom through opening and processing hundreds, or thousands, of files.

The problem is due to something called NESTED INCLUDES.

You know those statements you use in your code: #include

#include in a .cpp file : GOOD
#include in a .h file : BAD

The reason for slow compilation is that each .cpp file has a chain of dependencies. In order to compile that .cpp file, the compiler has to open and look through EVERY header file that is referenced by it, either directly OR INDIRECTLY, through massive long chains of header files including other header files that include other header files that reference other header files containing things that aren't even needed for the original header file, etc. etc. ad infinitum!!!

You can actually view this chain in some compilers; Visual C++, for example, has a /showIncludes switch.

A lot of people simply give up and use the compiler's best bodge to rectify the situation: precompiled headers. While precompiled headers can get around a lot of the problems, particularly in third party code, they don't work for your own code, unless you put loads of your own headers in them, in which case the precompiled data has to be recreated every time you change them, which kind of defeats the object.

A better way is to understand why the problem exists, and come up with slightly altered coding techniques that avoid it almost completely.


For fast compilation the name of the game is to STOP THE CHAIN of header includes, as soon as possible down the dependency line.

This means absolutely not having #includes in header files unless absolutely necessary. And where you must #include in a header file, make sure to #include something that doesn't itself have lots of dependencies.

IN YOUR OWN CODE

Instead of #including in a header file, as much as possible use FORWARD DECLARATION instead.

For instance, consider the example class CChicken:

class CChicken
{
public:
    CLegs * m_pLegs;
    CWings * m_pWings;
    CHead * m_pHead;
};

You may be thinking 'It doesn't know what CLegs, CWings and CHead are! I need to #include them!!'. If you did you would be wrong. You don't need to #include them, you just need to forward declare them. Add this where you would normally put your #includes:

class CLegs;
class CWings;
class CHead;

class CChicken
{
public:
    CLegs * m_pLegs;
    CWings * m_pWings;
    CHead * m_pHead;
};

It will then compile happily. Then when you come to actually USE the classes, put the #includes in the .cpp file, NOT the header! :) The compiler doesn't need the full definition of a class until you actually do something with it (call a method, dereference it, take its size). If you are just declaring pointers to the class in a header file, a forward declaration is sufficient.
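To make the pattern concrete, here are the two halves squashed into one self-contained sketch. The inline stub classes and the CountLimbs() method are made up for illustration; in a real project the stubs would be replaced by #include "CLegs.h" etc. in the .cpp file:

```cpp
// CChicken.h -- forward declarations only, so including this header
// pulls in NO other files.
class CLegs;
class CWings;
class CHead;

class CChicken
{
public:
    int CountLimbs() const;  // implemented in the .cpp file

    CLegs * m_pLegs;
    CWings * m_pWings;
    CHead * m_pHead;
};

// CChicken.cpp -- only HERE are the full definitions needed
// (these stubs stand in for #include "CLegs.h" and friends).
class CLegs  { public: int Count() const { return 2; } };
class CWings { public: int Count() const { return 2; } };
class CHead  { public: int Count() const { return 1; } };

int CChicken::CountLimbs() const
{
    // the compiler only needs the full definitions at the point of use
    return m_pLegs->Count() + m_pWings->Count();
}
```

Every other file that includes CChicken.h gets away with processing just that one small header.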

In this example, we have saved the processing of 3 additional files every time CChicken.cpp is compiled, and also EVERY TIME that CChicken.h is included anywhere else in the code. That is, we have (hugely simplifying) sped up compilation by 4x.

Now consider if CWings.h #included other header files itself, and those headers included more headers in turn. Quickly the chain can become astronomically large.

But by using forward declaration we can break this chain.

Sometimes, however, you can't use forward declaration: you need to define an object as part of your class, rather than just a pointer. In this case, make sure as best you can that the objects you use have header files that don't include lots of others, i.e. that the dependency chain is as short as possible.

THIRD PARTY LIBRARIES

Within your own code it's possible to have total control. But what, I hear you say, happens when using third party libraries? What if it's a third party 3D engine, or, god forbid, WINDOWS?

This normally means the third party library is needed for #defines, or class definitions.

Needless header file including doesn't come much worse than when it's done just to access simple #defines or typedefs. For instance, Microsoft Windows makes heavy use of DWORDs. I've actually seen (I kid you not) people write header files of the form:

#include <windows.h>

class MyClass
{
public:
    DWORD m_dwValue;
};

NO NO NOOOOOOOO!!!!!
This kind of thing makes me want to go outside, wait for sunrise, then fall on my sword.

Firstly, use browse info to look up what a DWORD actually is. It's not some super dooper class; it's ultimately just a typedef for a 32-bit unsigned integer.

So your file could be replaced by:

class MyClass
{
public:
    unsigned int m_uiValue;
};

If you've ever #included <windows.h> you'll have an idea of how much time this will save.

Another trick is to create your own SIMPLE header file, duplicating the common defines that you need to access in the third party bloatware. That way you can refer to them by the same name, just by including one header instead of a myriad.
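For example, a tiny stand-in header might look like the following. This is a sketch: windows.h actually defines DWORD via unsigned long (32 bits on Windows); unsigned int is used here so the size check also holds on typical 64-bit non-Windows compilers. Always verify the typedefs against your actual SDK headers:

```cpp
// MyWindowsTypes.h -- a dependency-free stand-in for the handful of
// windows.h types we actually use. Including this costs almost nothing.
typedef unsigned int   DWORD;  // 32-bit unsigned integer
typedef unsigned short WORD;   // 16-bit
typedef unsigned char  BYTE;   // 8-bit

// catch any platform where the sizes don't match the Windows layout
static_assert(sizeof(DWORD) == 4, "DWORD must be 32 bits");
static_assert(sizeof(WORD)  == 2, "WORD must be 16 bits");
static_assert(sizeof(BYTE)  == 1, "BYTE must be 8 bits");
```

Any of your headers can now use DWORD etc. by including this one tiny file; only the .cpp files that genuinely talk to the OS need to include windows.h itself.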

Other simple classes such as RECT, or a Vector3 :
DON'T include huge bloated stuff like windows.h just to use small classes like RECT.
Instead, create your own binary compatible class (with no dependencies in the .h file) and include THAT instead. Then when you want to pass it to windows (or 3rd party code), just CAST it to the type the compiler wants.

One gotcha is to make sure it really is binary compatible. Usually this is simple, but in some special cases there can be alignment issues (see #pragma pack documentation). This doesn't crop up often but is worth bearing in mind.
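A sketch of the idea. ThirdPartyRect and ThirdPartyWidth are made-up stand-ins for something like the windows.h RECT and a function that consumes it; the static_assert guards against the binary compatibility gotcha:

```cpp
// MyRect.h -- our own rectangle with NO dependencies, laid out
// field-for-field like the third party one.
struct MyRect
{
    long left, top, right, bottom;
};

// Normally this definition would come from the heavy third party header;
// a stand-in is defined here so the sketch is self-contained.
struct ThirdPartyRect
{
    long left, top, right, bottom;
};

long ThirdPartyWidth(const ThirdPartyRect * pRect)
{
    return pRect->right - pRect->left;
}

// confirm the layouts really do match before casting between them
static_assert(sizeof(MyRect) == sizeof(ThirdPartyRect),
              "MyRect must be binary compatible with ThirdPartyRect");
```

At the call site, just cast: ThirdPartyWidth(reinterpret_cast<const ThirdPartyRect *>(&myRect)). Only the one .cpp file making that call needs the third party header.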

While this speeds things up a lot when using 3rd party libraries, there is another tip that can be very useful, as 3rd party libraries are often written with no thought whatsoever for the compilation times of client code.

In this scenario, you want to call the 3rd party library from your cpp files. There may be lots of your cpp files that want to use the library, and each one ends up doing the #include for it, and taking ages to compile.

Instead of doing this, consider writing an 'interface' set of functions, or class, that offers a translation layer where you call them through your own simple header file, and the interface cpp file is the only one that has to talk to the third party library.

This means that instead of #including the dog slow third party library umpteen times throughout your own code and paying the penalty multiple times, you consolidate the penalty into ONE cpp file, which only needs to compile once, and once done, the functions are available to the rest of your app, almost for free in terms of penalty.
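A minimal sketch of such an interface layer. The Sound namespace and the ThirdPartySDK stub are invented for illustration; the stub stands in for the slow third party header:

```cpp
// SoundInterface.h -- the ONLY header the rest of the app includes.
// No third party includes here, so it compiles almost instantly.
namespace Sound
{
    bool Init();
    bool PlayEffect(int effectId);
}

// SoundInterface.cpp -- the ONE file that pays the third party
// include cost. In a real project the stub below would instead be:
//     #include <heavy_audio_sdk.h>
namespace ThirdPartySDK
{
    bool StartUp()       { return true; }
    bool Trigger(int id) { return id >= 0; }
}

namespace Sound
{
    bool Init()             { return ThirdPartySDK::StartUp(); }
    bool PlayEffect(int id) { return ThirdPartySDK::Trigger(id); }
}
```

The umpteen .cpp files that play sounds now include only SoundInterface.h, and the third party header is parsed exactly once per build.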

The added benefit is, that while you are writing code, you should be constantly compiling to check for errors etc and debugging. Now, instead of having to wait 20 seconds for your file to compile, you only wait 1 second because it doesn't have to #include the 3rd party library every time, only your simple interface to it.

Of course, this is duplicating to some extent an actual use for precompiled headers. However, using an interface can be preferable for two reasons. Firstly, you don't have to worry about setting up precompiled headers and all those #include "stdafx.h" lines throughout your code. And secondly, the use of interfaces is good practice, if for nothing else than because it allows you to update your app to use a new version of a third party library with the minimum of change to your codebase.

Anyway, this concludes my recommendations on the subject. Of course there will be exceptions, there are always trade offs, sometimes between compile time and runtime speed. But if you try and use these recommendations wherever possible it will greatly speed up your compile time, and hence your productivity.

Tuesday, 14 June 2011

Pseudoscience and Bodybuilding

Haven't posted in a while (moved house etc) but today I felt the need to have a rant about one of my pet interests, building muscle. I should say from the outset, that from my own perspective this is about being a little less skinny rather than the world of competitive bodybuilding, however most people agree the principles should be the same.

Having a scientific background, I have to say that I have rarely come across subjects that have been more prone to junk-science and hearsay than that of exercise and its effects on the human body.

Ever since our ancestors jumped down from the trees it seems men and women have been exercising to 'better' their physiques. Today most cultures identify an ideal male physique as having appreciable muscle mass, and low body fat. Female 'ideal' physiques are often portrayed as slim, and low in fat (not usually to the extent of males). There are cultural variations of course, particularly in the cases where body fat is seen as a sign of wealth and good food supply.

'ideal' Spartan physiques

Anyway, with all this time available, and people exercising at least since the times of the ancient Greeks, you would have thought we would have come upon some general principles as to the best way to change our bodies through exercise and diet. Perhaps the most telling thing in exercise 'science' is that we have NOT come up with undisputed, easily testable methods for changing our physiques.

This alone should tell us something : Perhaps because of the multitude of factors involved in an exercise program (diet, rest, training frequency and type) it is hard to pin down what factors or combinations of factors are responsible for success. Or, alternatively, perhaps once a successful formula has been found, the body quickly adapts to it and ceases to respond. Another possible alternative: perhaps which formula will be successful depends on the prior exercise regime (whatever the body is used to already).

The 'elephant in the room' with exercise science is that there are large numbers of people trying their hardest but making no significant progress. If anyone had 'pinned down' the perfect formula, you would think the idea would spread and everyone would be using it, except for those experimenting.

The multitude of factors undoubtedly involved in the success or failure of a program make it hard to conduct, and draw conclusions from, scientific experiments. Usually in science you try to keep all factors the same except one, vary this and then try to interpret the result. In bodybuilding the result should be muscle size (increase, stay the same, or decrease). This in itself can be hard to measure, and instead many studies measure 'strength', which is a bit vague and makes things hard to interpret. Unfortunately, aside from lab rats, it is difficult to control for factors such as the subjects' diet, sleep, hormones and existing adaptation to exercise, let alone the problems with standardizing the 'effort' involved in the exercise program under test.

a lab rat

All this difficulty in devising valid scientific studies has led to an explosion in pseudoscience, and conclusions based on belief rather than solid science. For instance, often one influential person (say a Mr Olympia or something, e.g. Mike Mentzer or Arnold) will write about what they believe worked for them at some particular time, and many people will believe this with an almost religious fervour.

However, this is highly unscientific. For a start, the sample size is often 1, which makes it hard to pin down the reasons for success or gains. Was it really 'high intensity' training that was responsible for gains? Or was the individual eating more calories, or using a successful combination of steroids at the same time? It is hard to say without doing a proper replicated study.

This also brings up another of the problems with the pseudoscience going around. The ideas that get circulated the most tend to come from the biggest guys, the guys with the biggest muscles. Almost by definition, these guys will also be taking numerous chemical aids (steroids, and now insulin, growth hormone etc). Not to mention they are also likely to have unusually good genes for muscle gain (low levels of myostatin, for example). And what works under these particular hormonal environments may be completely different to what works for Joe Average, training in his local gym with no chemicals to help. In fact the top guys' regimes may be equally likely to destroy Joe Average's body rather than build it up.


On the other side of the coin, many people preach the opposite, that 'everybody is different', and that we are all unique. Well yes we are, but it seems likely that muscle is pretty similar in all people aside from a few standard variations (relative numbers of fiber types, a few variations in proteins), and of course the hormone environment. In fact as far as I know skeletal muscle is pretty similar in different animals, so there is no reason to believe it should vary vastly in different people.

The truth probably lies somewhere in between the two schools of thought.


In my opinion, one of the biggest reasons to be sceptical of applying advice from steroid-assisted subjects to 'normal people' is that the process of training and building muscle seems to be a delicate balance in the muscle between catabolism (breaking down) and anabolism (building up). The process of training itself seems to be catabolic, to different extents depending on the intensity / volume etc of the workout, so the anabolic stimulus of the workout has to be enough to regain what was lost just to break even! Steroids and other 'aids' may act by altering this balance. If they, hypothetically, turned off catabolism but had no effect on anabolism, then the subject could train as hard and as often as they liked without having to worry about breakdown of the muscle. Only growth would result. Apply this same program to a natural trainer and the catabolism could easily overpower the anabolism.


How can we measure the muscle?

This seems to have caused enormous problems in studies and I am not clear why. Perhaps it is because the size of a muscle can vary with extraneous factors, such as hydration, glycogen content etc. Strength (1 rep max) has in many cases been used instead as an indirect measure. The problem with this is that you are then studying how to increase strength rather than size, which is not necessarily the same thing (particularly over the short timescales of these studies). Instead I propose that simply measuring the circumference of either the upper arm or thigh may be better. I have not yet tried the thigh, but the upper arm seems fairly consistent and an easy way to measure progress.


With any measurement it is important to standardize as much as possible. I have used the standard method of wrapping the tape measure round the arm at the widest point when the biceps is tensed. It is important to note this cannot be standardized at all times : particularly during the workout, the arm will fill with blood (the pump) and vary by around half an inch depending on the workout. Also the rest of the day after a workout it will often be reduced by 1/8 inch or so. In addition it also seems to increase by 1/16 to 1/8 inch immediately if taken upon waking in the morning. Other than these times it seems a reasonable measure.

As many different exercise groups indirectly exercise the arms, the best way to examine the effects of exercise on these muscles is probably to exercise them for a period to the exclusion of the rest of the body. And to maximize the changes observed and simplify things, it may also be a good idea to exercise the biceps and triceps in the same workout.

When following this protocol, I found that a good arm workout usually will add 1/16 to 1/8 inch to the arms. If you find after a couple of days the arm is the same size or smaller, then the exercise program is not working .. i.e. the catabolic effect is overriding any anabolic effect.

When you take measurements of this sort, you will also recognise that going without food can immediately result in a loss of size. If skipping breakfast can take away all your gains from a hard workout it teaches you to be consistent with your diet.

Finally, after all this am I going to recommend an exercise / diet program? To break with tradition in the exercise world I'm going to answer by saying

'I DON'T KNOW what the best exercise / diet program is'.

After all, it is the beginning of wisdom to say 'I don't know'. But I will say that if you take measurements such as I have described, you will be able to recognise when something you are doing is working, or not. After all, if it is not working, you need to change something: one definition of insanity is to do the same thing over and over again and expect different results.

Friday, 30 April 2010

UK General Election

And so we in the UK finally stand on the brink of another general election, on May 6th.

It's been a while since we had a change of government here, 1997 if I remember correctly, and most people were glad just to get rid of the old government (the Conservatives), never mind who got in. That seems to be the fashion of things: we let a party have a period in power, then when they get too complacent and over-pompous they get the boot.

It has been an interesting time in politics these past decades. With 'New Labour' there has been a switch from a party following an ideology and set of principles, to the system of 'focus groups', i.e. governments now try to find out what voters want, in key marginal seats, and do that, in order to increase their share of the vote, rather than having integrity and following a coherent 'plan'.

We now very much get the impression of parties not offering any kind of 'vision', but just offering us what we want to hear (depending on who they are talking to). Of course then they often do something completely different. The aim of the public face is to win votes, the actual agenda may be completely different.

National Debt

In the UK currently, we are in a situation where the country has built up a considerable national debt, created in no small part by Labour's policy of spend, spend, spend.

Spending money is great for getting votes. Who would deny that it is nice to have lots of money spent on libraries, schools, police, health, support for the unemployed, etc etc etc. As well as us all indirectly benefitting from these things, many of us benefit directly by being employed in the public sector to do these jobs.

This is all well and good, but forgets that someone has to foot the bill for these public services. Common sense would suggest that a government should balance its books each year, using the money it takes from taxes to pay for spending.

However, it can be difficult to get the sums exactly right, plus for example it is expensive to lay people off in the public sector, then suddenly reemploy them again when the financial situation improves. For this reason it is argued that it is better for the government to have a 'credit card', so that if it spends a bit much in any particular year, it is not the end of the world, and it will pay it back on average over a few years.

This system is great, as with personal credit, until it is abused.

Some people, and some governments, simply cannot be trusted with credit. Such people are addicted to spending, and cannot control their own behaviour. At some point, the creditors have to step in, and say 'enough is enough', and forcibly take back their money, bankrupting the individual, company or government.

(As an aside, a better system than credit is for the individual to build up a buffer of money. That way, in any particular bad year they can dip into the buffer rather than relying on outside institutions for borrowing. Any individual that is able to balance their books should be able to build up a buffer. If they can't, they don't possess the necessary financial skills to handle being in debt, which is kind of ironic, as the people most likely to fall into debt are those least able to handle it.)

Many of us here are worried that the current Labour government has fallen into this spending trap. The most morally frightening thing is that a government knows that if things get really bad, another party will get voted in, so there is never any pressing need to clean up its own financial 'mess'. And the blame will of course (wrongly) be partly taken up by whoever steps in to clean up the mess.

This vicious circle is fed by the way that government spending can be used to 'buy' votes. If for example, a government borrows 1000 pounds from investors, and gives it to voters, many will think, 'Hey this government is great, they are doing something for me! I'll vote for them'.

If there is no 'higher police' to say, no you can't do this, then it is down to the moral integrity of a government (or rather whether they think they can get away with it) whether they indulge in such practices.

The danger currently is that Labour seems to have fallen into this spending addiction trap. It has spent a vast amount to build up its voting base. And it simply cannot afford to stop spending, as it would lose voters. It would rather drive Britain into a sovereign debt crisis (bankrupt the country) than loosen its grip on power. It is addicted to power; power is its raison d'être.

The Conservatives in particular can see the problem, the 'elephant in the room': the huge difference between what the country is earning (in taxes) and what it is spending. In order to have any hope of addressing the deficit, any government is either going to have to increase tax receipts, or reduce spending drastically, or probably a combination of the two.

Brown is currently living in la la land, closing his eyes and ears and saying 'I can't hear you' to the financial markets, but the day of reckoning will come, as it currently has to Greece, which is facing the prospect of a financial rescue or even being thrown out of the eurozone.

In a sense, Brown doesn't care, as long as his party does well in the next election. They seem to want to hold on to power at any cost, rather than caring about what is best for the country.

Brown's solution for every problem seems to be to spend, spend, spend more; however, this ignores the fact that the wealth has to come from somewhere. If you want to spend more you either have to increase tax rates, or stimulate the wealth-creating businesses (the private sector) so that they are more successful, turning higher profits and thus paying more tax (at the same percentage).

The Conservative proposal (and philosophy) is to reduce taxes and make it as easy as possible for UK businesses and entrepreneurs to expand, create jobs and make profit, which in turn increases tax revenues, overcompensating for any loss in the headline tax rate. This approach would be coupled with a dramatic slashing of government spending, and would seem to be the most likely chance we have of balancing the books, and of having a hope of paying back the huge debts which Gordon Brown has built up.

Of course, cutting public spending is not a popular move. It will lead to job losses for public sector workers, and fewer 'bribes' for the electorate. But sometimes what the country needs is not what it wants (much like a spoilt child). In the same way, reducing taxes for businesses can be unpopular, as they are seen as 'the rich' rather than the engines that drive the country (would you want your economic engines choked with friction, or running as efficiently as possible?). Brown capitalizes on this popular misconception by playing up the natural envy of the masses ('us' and 'them'), when, counter-intuitively, taxing business less is highly likely to be MORE beneficial to the masses than higher taxes.


Gordon Brown also fears that each public sector worker laid off will simply join the unemployment line (which is perhaps why he shakes his head in so much disagreement), and this is probably true for a time. But the fact is that public sector workers are NON-PRODUCTIVE WORKERS. They do not create wealth. Their role, if useful at all, is to FACILITATE the lives of society as a whole, and ultimately of the wealth creators. Any tax they pay is just recycled money that came out of public coffers anyway. They are also a convenient way of fiddling the figures so that your government looks more successful than it really is.

It is only the private sector workers that are actually creating UK wealth. While there is a benefit to having a public sector to facilitate wealth creation, its 'bang for the buck' decreases as it becomes more bloated. Unless a public sector worker is significantly helping the country, it is actually BETTER to have them unemployed and collecting benefits than to be employing them, as the cost of benefits will be lower than their salary, not to mention that they will then have the opportunity to work in the wealth-creating sector.

Anyway, what happens in the election will be interesting. I feel most of the country will be slapping their heads saying 'oh god' if Brown manages to stay in power. On the other hand, I feel that a Conservative / Lib Dem coalition might not be such a bad thing, if they can combine some of their better ideas, and lose some of the 'pie in the sky' ones. I fully support the Lib Dem idea of rewarding those who come off benefits, something which I have written about in an earlier blog post on the benefit culture.

Whether we get a government that will be good for the country remains to be seen. I am not sure that our current flavour of voting and democracy is up to the task of electing governments that are optimal for the country, but I will save that for another post.

Friday, 6 November 2009

Physics modelling of string instruments

It's been a while since my last post, but much real life stuff has been going on in the last year, and I haven't had the opportunity to do much programming.

Anyway, on to today's topic. After having spent a bit of time working on my sequencer / sampling / composing program, I thought it would be fun to have a go at little programs for synthesizing instrument sounds, either to import the results as samples into the composer program, or to develop a plugin architecture for instruments, similar to the Steinberg VST instrument approach (but on a more basic level).

So it occurred to me that by using some simple physics models, it might be possible to get some interesting sounds from the complexity that often occurs in physics simulations. Obvious candidates seem to be a string (such as a guitar or piano) and a drum skin, the drum skin being a 2 dimensional version of the 1d string simulation.

I have only a vague idea of how the string system works: depending on the length of the string you get a base (fundamental) oscillation whose wavelength is set by the string length, then various overtones where multiple oscillations fit into the length of the string.

After a quick Google it (not surprisingly) seems several people have used the same approach. I found an interesting thesis on the subject by Balazs Bank, with some very impressive piano simulations, which got me interested!

Anyway I have had a quick go at doing a very simple simulation of a 1d string, then exporting the results as a wav file.

Each element of the string has a 1d position (amplitude) and 1d velocity, with the velocity altered according to the distance to each of the neighbouring elements (i.e. a lower neighbour pulls the element down, and vice versa). This velocity is used to determine changes to the position, and finally damping is applied to the velocity each iteration. Each end element of the string is clamped to 0.0 position and 0.0 velocity. A very simple model really, but enough to get some waveforms happening!

I really want to get things happening with a decent length string (hundreds of elements) but for now I've just been trying small numbers .. presumably the results depend on the parameters used for applying velocity, damping etc.

So here's some results:

3 elements (1 active element, as the ends of the string are clamped to 0.0)
5 elements
7 elements
21 elements

So far for 'listening' to the result I had just been outputting the amplitude of the middle element of the string. Instead I decided to take a slightly more realistic approach and sum the amplitudes of each element of the string:

21 elements (summed amplitudes)

This muffled off a lot of the higher frequencies and made a bit more of a realistic sound. Presumably in a real instrument the way different frequencies are muffled off or amplified can depend on things such as the shape and resonant frequencies of the box around the string (the wood around a violin for example).

Not really any super usable sounds yet, but quite promising for an hour or so's coding.

Thursday, 16 October 2008

How would I solve the global financial crisis?

I suppose everyone has their own ideas on this, and I'm not an economist, I'm probably being very naive but I thought I'd have a go.

On holiday I was reading Plato's 'Republic'. In it he describes a kind of 'ideal' communist society, where everyone works for the common good, like worker drones in an ant colony, with lots of the staples of communist regimes like censorship, indoctrination etc. Not really my idea of an ideal society. But it does make you think.

Personally I believe that there is a balance to be struck between pure capitalism and free markets (as the USA strives towards) and pure socialism (where everything is done for the state, and there is no reward for enterprise). I am not sure where the exact best balance lies, but there are some clues. I saw the movie 'Sicko' recently, which examines one aspect of this question: the reality of the free market health care system in the USA versus the more socialist systems in e.g. the UK or France. OK, it vastly overrates the usefulness of our NHS (us Brits think it's not up to much), but I'd take that any day over the system as portrayed in the US. Medical insurance is fine in theory, but if they really can refuse your care on a technicality then the whole system becomes quite sick really (as per the movie title).

So while I believe very strongly in free markets and having reward for enterprise, I also think certain services are best provided by the state - such as health, police, fire etc.

Currently with the bank crisis I find myself asking some similar questions about the banking system. If capitalist societies rely on banks as their backbones, the foundations upon which society is laid, then the banking system must be stable as a rock. As detailed in the previous post, there are some very serious and fundamental flaws with the current banking systems. While they work very well on sunny days, they really become worse than useless on rainy days.

For something so fundamental I have to propose 2 possible solutions.

One is that the state should run the fundamental banking system for a country (or even possibly the concept of a world bank?). Nationalization: the bank's money IS the state's money, and there can be no more confidence than that, I think. And if the state takes on toxic debts from e.g. the US, then it is its own fault; it takes the hit, the taxpayers complain and vote in a more prudent government next time.

The second solution, probably more likely as it is less radical, is that the banks, as 'keystones' in the economies of the world, are subjected to intense regulation in everything they do. Every loan and debt is recorded, audited and viewable (perhaps even by anybody?) so that their financial books are constantly under scrutiny, much like open source software. And have independent bodies (presumably this is what the FSA is meant to be for) scrutinizing the books and imposing vast penalties for taking on too much risk.

However, as one banking expert pointed out, the problem with the second solution is that it is easy to spot a risky loan in hindsight, but much more difficult to manage risk on a day to day basis. Mind you, you can't help thinking that it must be possible to manage risk better than certain institutions were doing (e.g. Northern Rock).

It is interesting that the banking system used to be far more highly regulated, but I believe that in the Reagan and Thatcher years many of the regulations were removed, and more so by Brown and his cronies. Perhaps they were the true architects of this crisis.

On top of all this, one also has to ask some questions about the whole fractional reserve banking 'con', which exacerbates the problems in times of crisis. The question is, could we live without it? Or has it become the engine that drives our economies?

The Credit Crunch

It's been a while since I posted, but I've been on holiday, and since I came back I've been dealing non-stop with the symptoms of the financial crisis. It's certainly made me read and understand a lot more about economics. I'm still far from knowledgeable about it all, but here are a few thoughts on the matter.

First, to set the scene:
Over the past year there has been increasing worry in the banking sector about the presence of 'sub prime' loans in the system. In the USA in particular, the banks had been so eager to keep making profit that they had loaned money to people to buy houses where there was no hope of the mortgage ever being paid off.

Instead of keeping this debt on their own books, these banks packaged up the debts and sold them on to other international banks, presumably while downplaying the risk that the debtors would default.

Of course if the debtors defaulted, there was always the house left as collateral for the loan - i.e. the banks could reclaim the house, sell it on, and get back the money that was owed.

However, an added problem is that house prices in the US (and the UK) have been dropping. In many cases it could also prove nigh on impossible to sell on the houses once the buyer defaulted. This meant that there were an awful lot of 'sub prime' loans in the international banking system that were not worth a hell of a lot.

Of course the original home loaning banks didn't really care too much - they had sold the toxic debt on. After all, there seems to have been little regulation in this industry.

Throughout the last year worries about this problem have been spreading throughout the global banking system. Banks work by taking deposits from us, then investing most of that money themselves, by mechanisms such as providing home loans, or lending it on to other banks to invest, at an inter bank lending rate (LIBOR). They keep a little cash on hand, just in case some depositors come asking for their money back, but the vast majority is out there invested somehow, making the bank interest.

Now what has happened is that banks have become worried that their lending neighbour banks may have become contaminated with lots of worthless sub prime loans on their books. As far as I can see, the books of the banks are confidential, so there seem to be games of rumour and Chinese whispers as to which banks have lots of their money tied up in these worthless loans.

The problem is, that if I lend money to a bank that has lots of toxic loans, and that bank runs into liquidity problems as a result of the toxic loans, then I'm not sure if I'll get my money back. Now consider that this applies for both me as a depositor, AND other banks in the inter bank lending market.

The banks really don't want to lend to each other because of this high risk, and thus the interbank lending rate (LIBOR) is sky high. However, that means the only way banks can make money and operate is through their own cash from depositors, making loans etc. themselves rather than via other banks.

This results in the banks having to offer very high interest rates, because they are desperate for cash from depositors. It effectively means that in the past few years they have become less and less tied to central bank (e.g. Bank of England) lending rates - i.e. they are ignoring the moves that politicians make, because free market forces have taken over.

It also means that with this small amount of working capital the banks have to be VERY careful about who they loan it to (to make profit), AND they will only loan it at high rates of interest, as they need to make money to survive.

This means many businesses (particularly small ones) will apply to their bank for a loan to operate, and be refused. Businesses are thus having to downsize, or go under, from lack of loan money, and thus a lot of people are going to lose their jobs. This job loss stage is just beginning. When people lose their jobs, or are worried about their jobs, they cut back their spending, thus lowering the money made by other businesses, leading to more redundancies etc. The cycle continues and we have a recession.

But wait!! It's even more complicated than that. There is an extra 'BONUS' risk. This can be quite tricky to understand, so I'll say it slowly:

Banks are basically more advanced versions of the 'money lenders' in the temples of Bible stories etc. The idea is that if you are rich, you can either hold onto your current wealth, or make it grow even bigger by lending it out temporarily to other people, charging them 'interest' - a percentage - for the privilege.

Of course if you are going to do this, you need some kind of mafia scenario, where you have enforcers to beat up your clients because many of them are very unreliable and will need 'persuasion' to pay back your loans.

I digress... Anyway, this was the initial system: rich people lent out their money and got it back with interest. Some years later, some bright sparks came up with the concept of a bank. Instead of having a rich guy provide the capital, a company (the bank) would build a big vault to ward off robbers, then offer citizens the ability to deposit their cash in the bank (to keep it safe).

The citizens were happy - they could keep their cash safe (or at least safer than under the mattress) - and the banks had capital, some of which they could lend out, charging interest, to businesses, home buyers etc.

The above is a simple banking system. There is however a problem even with this system. Because the bank has invested much of its capital, if all the savers came to the bank at once and demanded their money back, they couldn't have it!! The bank would suffer a liquidity crisis (a technical way of saying it didn't have the cash), as the money was tied up in loans to other people and businesses.

This scenario is called a 'run' on a bank. Providing people have confidence in their bank, on average only a small percentage of savers will be asking for their money out on any given day, matched approximately by other savers putting money in. In this way, providing the bank keeps a reasonable amount of its capital in cash form (not invested), it can stay solvent.

But wait, here's the mad bit. At some point along the line banks stopped using hard cash (e.g. coins) to lend, and started using, in effect, 'IOU' notes for lending and mortgages etc.

Then some incredibly bright spark(!) invented what is called fractional reserve banking. If a saver deposited say 100 pounds into their account at the bank, the bank would (theoretically) have 100 pounds it could then invest and lend out to e.g. a homebuyer somewhere. This is very logical.

One day the bank managers got together and decided that, being reliable fellows, why not increase their potential to make profit by allowing themselves to lend out MORE money than they had in deposits!! I.e. when a depositor gave them 100 pounds, that alone wasn't going to make them much interest on investments, so instead they would invest that 100, but also conjure up 900 from thin air, and invest that too!!

After all, these notes they were issuing for mortgages etc. were only IOUs - they could write anything they wanted on them. And they were reliable sorts, these bankers; providing everyone paid their loans back, they would make 10x the profit, and no one would suspect a thing!!

So fractional reserve banking is kind of like a con trick, except it has become accepted as the conventional way of doing banking. That is because, in most cases, it works... it applies 'leverage' and makes 10x the profit from the same amount of depositors' money.

However, this con can multiply the problems caused when a bank runs into trouble. When normal businesses run into trouble, they go into administration, and the administrators split up the remaining assets of the company, sell them off, and divide the money among the people the company owes money to.

But with a bank, most of the debts the bank has were made with IOUs - they were never backed by real money!! That means when a bank goes under, in many cases the IOUs will become almost worthless. This means that in a risky climate, banks are incredibly risky things to lend your money to.

And this is what is happening: the whole banking network is built on the fractional reserve 'con' trick, so the banks are incredibly wary of lending money to each other, just in case one of them goes down. And if one goes down, any banks that have lent money to it will suddenly find themselves a lot worse off, and in a position where they could go down too. And then any banks which rely on this second bank get taken down, and the cycle continues. The danger is that the whole banking system can fall over like a stack of dominoes.

This is essentially, as I understand it, what happened in the Wall Street crash of 1929 and the following depression of the 1930s. Very large numbers of banks collapsed, millions of people lost their savings, lots of businesses went under, and there was massive unemployment.

The US government at the time believed strongly in the free market capitalist system 'to the end', and thus didn't provide any help when this situation occurred. It saw it as the 'weak banks' being taken out leaving only the fittest still standing.

Of course it doesn't actually always work like that. They neglected to realise that the knock-on effect would be a collapse of the rest of the economy, as everything in capitalism depends on the banking system, i.e. banking is the backbone on which everything else rests. If the banking system goes, your whole system of society is at risk (you can end up with anarchy, everyone for themselves).

This time round the scenario is very similar. Most people have been unaware of the risks involved here; they are too busy watching 'Big Brother', or seeing what Madonna or Kylie are up to - they aren't 'interested' in financial matters. It reminds me of that scene in 'Constantine', where Keanu asks the woman 'do you believe in the devil?'. 'No,' she replies. 'Well, he believes in you!'. It doesn't matter whether people have any interest in the financial system; they are still wholly dependent on it for practically everything in their lives.

This time round most casual observers make the same comments and mistakes that were made in the 1929 crash: 'It's the banks' fault, let them go down.' Of course the stack of dominoes would result, and the world could fall into the abyss. Quite frightening that these are also voters.

Luckily those making the decisions (well some of them) are a bit more versed in the hazards and the knock on effects. We stand on the edge of the precipice. As far as the governments are concerned they want to maintain the status quo. On the surface the problem is one of confidence. They want to restore confidence. Confidence on the one hand to depositors, in order to prevent runs on the banks. And confidence on the other hand to the banks so they will lend to each other, and hence make them more able to provide loans to businesses and homeowners that keep the economies of the world ticking.

The latest plan, used by Gordon Brown and Alistair Darling and now being followed to some extent in many countries, is to address these problems by providing capital (to prevent liquidity problems) and by guaranteeing inter bank lending, to get the banks lending to each other. It is in effect a giant band aid applied to the current banking structure / status quo.

Of course, because of the fractional reserve banking system, the figures involved are enormous, but hey, the taxpayers have no choice - they elected their governments... Besides, it's just going on the countries' own debts (they each seem to have some kind of international 'tab', another con perhaps?). And the argument is that much of it is only a 'guarantee' - insurance on bank lending - so if it works, everything should run smoothly. Shouldn't it?

Well, now we are beginning to see the signs. The plan was announced around a week ago in the UK, and rather more recently in other countries. The FTSE, Dow Jones etc. all jumped on the news of the global 'bailout' for the banks. But now they are falling again. People are beginning to realise that the problems of trust are more endemic and are proving very hard to solve. They will probably take years to return to normal levels (and I doubt they will without some kind of modification to current systems).

The inter bank lending rate (LIBOR), in the UK at least, hasn't responded as Brown and Darling would have hoped. In short, the banks still aren't lending. We are still on the edge of the precipice. And what's more, many of the governments have 'shot their load'. They don't have infinite finance. They can't carry on pumping billions and trillions to prop up the banking system indefinitely. And the bad debts of Lehman Brothers are going to be looked at shortly. What other institutions might go down as a result? What will be the knock-on effects of several European banks going down - in Iceland, in Britain, in France, in Germany etc.? What were the interlinks? Will the stack of dominoes start to fall?

Interesting times!

Wednesday, 30 July 2008

Knowledge Storage

I'm back at work on Egor now.

While the old version handled sentences such as 'what is a cat?', I now want to extend this fully to WH questions (what, why, where etc.):

Take questions such as 'where do you live?'. The old system handled these with a bit of a bodge. Now, when such a question is parsed, it adds an entry for the unknown 'WH-word' into the knowledge tree... it can either identify the answer now, or perhaps come up with it later when it has more knowledge.

An interesting thing happens when you look at slightly more complex variants of these questions.

For instance:
------------------------------------
The cat eats sardines in the kitchen.
The cat eats mice in the garden.

Where does the cat eat sardines?
------------------------------------

Initially I was storing the information that the cat eats sardines, and that the cat eats mice, on separate branches (sub trees) from the subject. However, it occurred to me that reusing branches may be the way to go, both in terms of efficient compression of the information, and in terms of speedy and efficient access to it.

However, once you start compressing the information, another 'issue' appears:

If you store 'the cat eats sardines in the kitchen' in one tree, the order of the object and supplementary information essentially doesn't matter...

i.e. the cat eats in the kitchen sardines = the cat eats sardines in the kitchen.

Once you start compressing several sentences of information into the same subtree, you then have to start considering the order of information.

Thus: The cat eats sardines in the kitchen, The cat eats tuna in the kitchen...

You may start to think of this as a hierarchy: cat -> eats -> in the kitchen -> sardines / tuna

However, this has many implications. Firstly, you can no longer directly store information as generics (i.e. in tree terms, the 'in the kitchen' node needs to be distinct and have child nodes). This is an added level of complexity, so we would have to be sure we were getting a payback for it.

In addition, once you start to consider several pieces of supplementary information for a sentence, the optimum storage arrangement may not be obvious (i.e. how you are going to regularly access the information determines the best tree structure).

As I am modelling things according to how biological systems tend to work, there is also the point that biological systems often take the simplest path (making complexity from simple rules) rather than working with a complex 'operating system'. I.e. there is a danger of anthropomorphizing the problem - producing a computer science solution instead of a simpler (possible) biological one.

I am not sure which one to go with at the moment, because it seems a major design issue. I may well start by experimenting with the simple approach. It may turn out to be incorrect (and later need a considerable rewrite), but the fact is that the whole project is a huge undertaking and I would rather have a simple system working than a more complex system that I didn't have nearly enough time to get to a working state.

In essence I can't hope to get everything perfectly right and optimal on my first attempts, I think this is something that will be refined in many decades to come, to one or several optimal solutions.