No talk Jan 2012
Monday 13 Feb 2012 Prof Eddie Wilson, Southampton Uni: "Phantom Traffic Jams"
"We all know the experience - you are driving along the motorway when all
of a sudden, everything grinds to a halt. When you reach the front of the
queue, there is no accident, no debris - nothing to explain the hold-up.
So what happened?" Eddie Wilson is Professor of Modelling and Simulation in the
Transportation Research Group at the University of Southampton. His
background is in Applied Mathematics and he is interested in the emergent
patterns of everyday human behaviour. His work on phantom traffic jams has
featured on TV programmes such as Cutting Edge and The One Show.
24 people, Q&A interspersed within the talk, about 1 and 1/4 hour
I'm in the Transportation Research Group at Soton University. I will talk about a
phenomenon that, if you're a driver, you will have encountered on the motorway.
(In fact I encountered this on the ...) You are bumbling along happily and you see
red tail lights ahead of you, maybe hear a horn, and everything grinds to a halt -
another traffic jam. I'm only talking in the context of motorways in this talk.
You queue for a few minutes in one of these phantom traffic jams, sometimes at a
complete stop, sometimes crawling along; suddenly you reach the front of the queue,
the road is clear in front, and you zoom off at top speed. About 6 minutes later -
a scientifically established time - you will come across the same thing again.
There is a certain periodicity to things: certain days where you keep going through
these situations every 6 to 10 minutes, varying with conditions. The strange thing
is that each time you reach the front of these queues, there is nothing there: no
roadworks, accidents, debris or anything. At the front of some jams there is an
accident or debris, but phantom traffic jams have nothing to explain them.
These are sometimes called phantom traffic jams, stop-and-go waves, and other terms.
So what causes these phantom jams, and why don't you see what, if anything, caused
them? The answer, somewhat strange, is that the jam started far, far in front of the
point where you encountered it, and then travelled back along the road towards you,
something like a wave - so think about ripples on a pond. They are periodic, just as
these stop-and-go waves of 10 minutes are, propagating out. So you are driving
forwards and the waves are going backwards. As we are in a pub, this sounds like
Guinness. Famously, in a glass of Guinness, or other drinks with a nitrogen feed,
the bubbles rise to the top of the drink, but there is a pattern or modulation in
the bubbles which looks like waves. So particles - or cars in this case - are moving
in one direction, but waves in those particles are moving in the opposite direction.
If you go onto Wikipedia and look this up, people at Edinburgh Uni have done
detailed high-speed videos of Guinness and discovered that the bubbles are actually
going down, in the same direction as the waves. There is some strange convection
pattern where the bubbles go down on the outside of the glass and up in the middle.
However, if the bubbles were going up and the waves going down, then it would be
exactly like phantom traffic jams.
Video from a BBC TV programme: a motorway simulation. Somehow something has come to
a halt, shown by the red taillights. Traffic is arriving at the back of the queue,
so the back of the queue moves backwards, but at the front of the queue vehicles
drive away. In certain circumstances these balance and you have a queue of standing
traffic which, although the traffic in it is going forwards slowly, as a queue
rolls backwards along the road.
Q: Is the queue getting bigger? If people are arriving and departing at the same
rate, is the queue getting bigger?
If vehicles arrived at the back of the queue faster than they were leaving at the
front, then the queue would get longer - it would still go backwards, but it would
get longer. If vehicles were arriving at the back slower than leaving at the front,
then the queue would evaporate. Very often those 2 rates are completely in balance,
for reasons I will come to. The queue sits there at constant length, going backwards.
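The bookkeeping above can be sketched in a few lines. This is a back-of-envelope
illustration, not from the talk; the 9 m of road per stationary vehicle is a figure
the speaker quotes later, and the flow values are made up for illustration.

```python
# Queue-length bookkeeping: the back of a jam grows with the arrival rate,
# the front shrinks with the discharge rate, so the queue length changes
# at a rate proportional to (inflow - outflow).

def queue_growth_rate(inflow_veh_per_h, outflow_veh_per_h, jam_spacing_m=9.0):
    """Rate of change of queue length in metres per hour.

    Each vehicle added to (or released from) the jam adds (or frees)
    about one jam spacing of road, assumed here to be 9 m.
    """
    return (inflow_veh_per_h - outflow_veh_per_h) * jam_spacing_m

print(queue_growth_rate(2000, 1800))  # inflow > outflow: queue lengthens
print(queue_growth_rate(1800, 1800))  # balanced: constant length, still rolling back
print(queue_growth_rate(1500, 1800))  # inflow < outflow: queue evaporates
```

Note that even in the balanced case the queue is not static: both its ends march
backwards along the road together.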
Are they common? If you took space-time maps / scatter maps for all roads in the UK
in 2011 and looked at the areas covered by phantom traffic jams, they are actually
rather rare. But we go on long journeys, and particularly on bank holidays you will
come across them. In terms of total roads and times, though, not so common.
Where is the best place to see them? The M6 northbound, from Birmingham up to the
Lake District. We've seen days in the data where this entire 90-mile stretch is
covered with stop-and-go waves. We've seen individual jams that have travelled 50
or 60 miles; we can follow them in the data. That is an individual queue, maybe
half a mile long, which has persisted over the course of a whole day, rolling back
50 or 60 miles.
They occur on busy motorways, but where they start is the key thing. Usually it is
at motorway junctions that are over capacity. Other places too, where there is some
kind of pinch, like a lane drop or rubber-necking at an accident on the other
carriageway. That's not necessarily where you see them; that is where they start.
If you do see them at that point, then you see what has caused them.
Is this just a British thing? Afraid not: all over the western world, wherever we
gather data, you can see these things. And universally the speed of these waves, c,
is -18 km per hour - minus as it's going backwards relative to the cars. I'm
telling you this is a universal constant, but really it is almost a constant.
Q: It doesn't depend on the speed that the traffic is moving?
No, but it does depend on things like the composition of the traffic: more trucks
tends to slow the wave down, and lots of vehicles with cruise control and automatic
transmissions slow the waves, so in the USA they tend to go markedly slower. We
don't understand all the detail, but we can observe them and correlate.
What do they look like? Standing by the side of the road is not a very effective
way of collecting traffic data. In practice we use the rectangles you see in the
road surface at traffic lights, or diamond-shaped ones where they are squeezed to
fit into the lane. The motorway system is covered in these things. They are
inductance detectors: a loop of wire in each rectangle. When you hold a piece of
metal, like a car, over the detector it changes the inductance of the circuit,
which is measurable by some roadside electronics. It relies on metal, not
magnetism. With pairs of them you can get enough information, in terms of timing,
to not only count cars but measure their speed. You can measure how long they are
too. On English road networks these things are typically every 500 metres, and
closer at strategic points. We get lots of data: every 1 minute we get the count,
flow, average speed and things of this kind.
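A minimal sketch of the timing arithmetic behind a detector pair. The loop spacing,
loop length and the timings below are invented illustrative values, not real
detector data or the actual roadside algorithm.

```python
# Speed and length from a pair of inductance loops: the time offset between
# the two loops gives speed, and the time one loop stays covered gives length.

def speed_and_length(t_loop1, t_loop2, occupancy_s,
                     loop_spacing_m=2.0, loop_length_m=1.0):
    """Estimate vehicle speed (m/s) and length (m) from a detector pair.

    t_loop1, t_loop2: times (s) at which the vehicle's front reaches each loop.
    occupancy_s: how long the vehicle kept the first loop covered.
    """
    speed = loop_spacing_m / (t_loop2 - t_loop1)
    # While covered, the loop "sees" the vehicle length plus its own length.
    length = speed * occupancy_s - loop_length_m
    return speed, length

v, l = speed_and_length(t_loop1=0.0, t_loop2=0.0625, occupancy_s=0.16)
print(v * 3.6, l)  # 115.2 km/h and about 4.1 m: a car at motorway speed
```

The same occupancy reading is why a stationary vehicle gives so little information:
with no motion there is no timing offset to divide by.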
Q: Even if a vehicle is static over the loop, do you get recognition of that?
If nothing is going over the loop, then the only thing you get is that the loop is
covered. So it is a bad data source for stationary traffic: you can detect that it
is stationary, but you have no other information. What you really need in that
situation is a helicopter. The loops capture time averages, like a single observer
sitting and watching things go past. What you really want is a space average, as if
you had an aerial position and observed a whole section of road at once.
So to some of my graphical data, my maps or atlas of data. They are space-time
diagrams - just physics. Colour is speed, picked off one lane of the motorway: the
M42 active traffic management section at Birmingham, about the most monitored
stretch of road in the world, with detectors about every 100 metres. That allows us
to make pretty pictures. One afternoon: a bad day for a driver but an interesting
day for pictures. Time runs along the bottom, distance is the vertical axis, red is
congestion. J6, if you zoom in, has all kinds of stripes; zoom in on one of these
stripes and that, in data, is a phantom traffic jam. Cars go forward in space and
time, and the jam has the opposite slope, as it is going backwards at about 18 km/h.
So you are driving along fast (blue), you hit the back of the queue, queue for a
few minutes, reach the front of the queue, there is nothing there and you drive
off. As you met this at J6 you would assume it was caused by J6, but the same jam
can be observed much further upstream than J6.
Why does the inflow match the outflow? There is a whole array of these things, with
periodic or nearly periodic patterning. It turns out that on busy days, when you
get a whole set of them, the outflow of one of these is a characteristic number,
governed by behavioural dynamics, such that in an array each one (except the most
extreme) is being fed by the outflow of the previous one - and that's why it's in
balance.
Why are they there? It's an idea of instability. There is a whole academic area
about this, so there is disagreement with the view I'm proposing. In the right (or
wrong) situation, small fluctuations in a busy traffic stream somehow get magnified
just by the nature of the traffic itself, initiated perhaps by individual vehicles
changing lane. Something down at the scale of a few vehicles gets magnified up.
Is this traffic instability real? A TV programme experiment, on The One Show: we
got some people to drive round in a circle. The film has been edited to select a
run that shows what we want it to. It had been done previously by some Japanese
researchers, so we knew it was going to work, but the drivers didn't know what was
supposed to happen. We did this 6 times and it happened on 3 occasions. Sooner or
later, somewhere on the circle, someone spontaneously makes a driving error - a bit
too much - and then suddenly there is a bunching. If you speed up the video a bit,
the bunching is clearer: it is going backwards while the vehicles are going
forwards. Unfortunately, although we did not tell the drivers what to expect, they
soon got the idea and started changing their behaviour to stop it happening.
Q: What did they do to their behaviour?
They started backing off, leaving larger gaps and slowing down. We told them to try
to keep a constant spacing and drive at at least 15 mph. The BBC signed off on the
insurance. The venue was a large car park attached to the Toyota factory in
Derbyshire.
Q: One behaviour I'm thinking of is the 2-second rule: the vehicle in front of you
passes under a traffic gantry, you count 2 seconds, and if it is shorter than 2
seconds you back off - whatever your road speed, you keep 2 seconds. If all
vehicles kept a 2-second interval, would you get phantom jams?
Probably not, but here is the thing: you'd also see a reduction in motorway
capacity compared to what we have at the moment. Drivers would not be able to
continuously maintain 2-second gaps. At peak capacity of motorways you see about
2200 vehicles per lane per hour, which translates to about 1.5-second gaps - and
that's the average. So there are a lot of people at something like 0.8-second gaps.
So backing off solves the problem but reduces the capacity.
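The capacity arithmetic here is just a unit conversion between flow and average
time headway. A sketch, using the figures quoted in the talk (3600/2200 comes out
nearer 1.6 s than 1.5 s, the same order as the speaker's rounded figure):

```python
# Flow (vehicles per hour per lane) and average time headway (seconds)
# are reciprocals of each other, up to the 3600 s/h conversion.

def flow_to_headway(flow_veh_per_hour):
    """Average time gap (s) between vehicles at a given flow."""
    return 3600.0 / flow_veh_per_hour

def headway_to_flow(headway_s):
    """Flow (veh/h) if every driver kept exactly this time gap."""
    return 3600.0 / headway_s

print(flow_to_headway(2200))  # peak capacity ~2200 veh/h -> ~1.6 s average gap
print(headway_to_flow(2.0))   # a strict 2-second rule caps flow at 1800 veh/h
```

This is the speaker's point in numbers: a universal 2-second rule would cap each
lane at 1800 vehicles an hour, below the observed peak of about 2200.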
Q: Does tailgating start the process off?
These vehicles in a circle are not going anywhere, so there is no incentive to go
faster. Trying to go faster than the situation allows is part of the whole
behavioural problem. The sort of thing that often kicks off in the fast lane on
busy afternoons is a separate thing from stop-and-go waves: those usually evaporate
rapidly or result in an accident, and they don't usually entrain the other lanes.
Stop-and-go waves, by contrast, don't disappear easily once they get started,
unless there is some kind of behavioural modification. Hysteresis is the kind of
thing that we think is going on, and that is a more complicated theoretical picture
than I've covered here.
Q: Did you work out your "c" for this driving-in-a-circle experiment, for that
situation and that speed?
I hadn't done so, but the track is about 230 metres round. One way of looking at it
is to say you have stationary traffic and vehicles arrive at about 1500 vehicles an
hour - roughly 1 every 2 seconds. A stationary vehicle occupies, say, 8 or 9
metres; multiply those figures together and you get that sort of value of c.
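Putting those quoted figures together gives the order of magnitude of c. A sketch,
assuming 9 m of road per stationary vehicle and one departure every 2 seconds (both
figures from the talk; the exact values vary with traffic composition):

```python
# The front of a jam retreats by one vehicle-length of road each time a
# vehicle drives away, so c ~ jam spacing / departure interval.

jam_spacing_m = 9.0         # road occupied per stationary vehicle (assumed 9 m)
departure_interval_s = 2.0  # one vehicle released every ~2 s

c_m_per_s = jam_spacing_m / departure_interval_s
c_km_per_h = c_m_per_s * 3.6
print(c_km_per_h)  # ~16 km/h, the same order as the observed ~18 km/h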
There are some classical ideas around instability, back to the 1950s. Graphic of a
platoon where the leader is going along at constant speed , and remainder is going at the same
speed and I kick the second vehicle . So it fluctuates from its normal dynamics
and it soon entrains all the vehicles behind it in a similar motion. But the front vehicle
is still driving along at the same constant speed. That platoon instability keeps going
forever. Thats what people analysed for many years and they got it wrong.
Now we look at what is called string instability. Almost impossible to show a video
. The axes are turned round sso the people are driving up the page, the leader is
driving along at constant velocity , the second vehicle has this kick and fluctuation.
After a few seconds doing this they correct back to steady driving, because why shouldn't they as
the vehicle in fron is driving regularly. The next vehicle gets influenced by the driver in front
but their fluctuations are a little bit bigger and a bit longer . Sooner or later they too
will return to steady driving , again why not as the vehicle in fron has now returned to
regular speed. In this string theory you go down the column and each vehicle returns to
regular driving , the fluctuations get bigger and longer . What we have is a wave opening up
like an arrow or wedge so you only see these things if you are moving in the correct frame
in space-time. If I was sitting at one point in time or one point in space I'd just see
just a little packet of stuff going past me. This is what most of my mathematics
is built on . So a bunch of equations as simulation drivers. To analyse the instability
properties to show they can be explained as different sorts of waves etc. A whole game
in developing micro-simulations . Used by consultants to predict the effect of
road schemes etc. I do the basic science that underpins those.
Showing the simulation that is on
this internet java applet where you yourself can freely
change the parameters
Generate your own phantom traffic jams , rolling back, gaps and the periodic pattern.
Things aren't quantatively right in this simulation but they are qualitively right, and is the
kind of thing we are chasing the whole time. Understanding what you have to put into
these models to get them correct.
If you do this on a ring road, like my car park experiment at Toyota. If you put very
few cars in the ring then they whiz round at top speed and no problem. Go to the othe rextreme
and an enormous number of vehicles you get grid-lock. If this simulation was calibrated
properly then the interesting thing would be in the middle where you see stop and go waves.
In the real world highways you have to have some quite special conditions to trigger
thes ethings. Drop the inflow right off on this simulation then they don't start in
quite the right way.
Something I worked on recently published by the Royal Society , thinking about these
waves and classical ideas . Trajectory pictures, same space-time pictures as before
but now plotting the trajectories of cars , all at constant speed the lines should have a
nice constant slope . Parallel lines all at the same slope . Then a perterbation ,
say bad lane change , waves grow as a wedge spreading backwards down the road.
Because of the way wave models are put together, you're influenced by vehicles in
front of you , generally speaking and not vehicles behind you . That means these waves
have to go backwards relative to the column of vehicles. In practise perhaps you
do change your behaviour because someone is right behind you , but generally the
interest is wityh the vehicle in front of you. That means information waves have
to go backwards in any physical model. However its backwards relative to the vehicles
and the vehicles are driving forwards , so which wins. In data we know the waves go backwards
relative to the road, not just to the vehicles . So a whole calibration thing with the models
have to be respected . You could have the situation that the vehicles are travelling so fast that
the waves are carried forwards. Mathematically called convective instability and we need
to compute group velocities , signal velocities a novel analysis that no one had attempted
before io nthe context of road traffic theory. However there is a classic papers in 1910 and
1911 by Einstein and Sommerford who had done it in the context of light propogation.
Group and Signal velocities had been though the same or confused previously
by traffic theorists. We kind of rediscovered this in the context of traffic.
How to eliminate phantom jams. If you work for the Highways Agency you are never allowed
to say you can eliminate these things as that is completely impossible. Its always mitigate
or reduce , that kind of thing.
In the future, what used to be called controlled motorways on the M25 then active traffic
management on the M42 then managed motorways on many motorways.
The general idea is that you reduce the speed limit when things get busy and that is known to
stabilise traffic . We've never really understood at a behavioural leve, we can all guess at it
, apart from our own anecdotal experience. We try to increase the capacity of the traffic and one
way of doing that is to advise against lane changing , another is to make the cold shoulder
a running lane as traffic management in Birmingham. The other thing is once stop&go waves
are formed is to warn people about them as they are known to be very dangerous. You
don't want people screaming into thte back of standing traffic. So you may see
Caution : queue ahead. Then why you need all this simulation and modelling as a sign is not
very good at predicting . You see occassions when it predicts and no jam and then it
doesn't predict when there is a jam. Behind this is how we optimize and fully automatic
operation of signs. Because at the moment these systems need lots of people in control
The future - such as a Google car , not the kind that does mapping , 3D scanning Lidar on the
roof of it , there is a driver but does not have his hands on the wheel. A self-drive car
and claiming in a motorway context that the googlecar makes strategic driving decisions
better than a lot of drivers, maybe not as good as an advanced driver. But in terms of
lane positioning it is way ahead of humans. If we all had that then maybe traffic
would be more stable.
End of talk proper
Q: you say you need lots of people in control rooms , have you not got to the level where
a model can detect and respond that you slow down the flow.
The problem is the safety aspect . So for example on hard-shoulder opening , originaly
it was envisioned you could use automatic detection systems to determine that the
hard shoulder was empty before you open it. You can , most of the time, but most of
the time is not good enough. Now they do a full CCTV sweep , that means people
looking at those screens and there has to be a HATO ? that look like police cars that aren't
driving up and down the hard shoulder in adition before they open a hard shoulder.
Q: Are you saying it is automated but the machine is also monitored ?
The system called HIOK ? which detects stop and go waves and sets the immediate upstream
signal to lower limit and sets sign saying congestion ahead. That is a fully automatic system that
responds on the timescale of seconds. Partly why there are soi many false positives.
So you have a long vehicle , often in the middle of the night , drives over it rather slowly
it triggers the HIOK. Better a few false positives than a few true negatives.
The controlled motorway sytems are pre-emptive , looks at traffic volume and when it
gets above a certain critical number it actually reduces the speed limit before the problem
starts. Thats what most of these 50mph limits that you see are due to.
The computer generates prompts but it is someone in the control room who sees the prompts
and decides whether to flick a switch.
Q: Do we know if that reduction of the speed limit works or whether it is theoretical and
Statistically stop&go waves have reduced since theyt started doing it veven though traffic demannd has
gone up . Other factors have changed as well. The simplified theory behind it , used at the
outset has now been debunked. If you were trying to get increased flow then optimal flow is about
50 mph. And that's true if the limit is 70 mph but not true if the limit is 50mph. The whole curve
changes. They now think that if you set the limit at 50mph , anecdotal with some data,
that everyone wants to drive at about 60 . So with a 50 limit most people drive
at exactly 52 mph which is their computation of what they can get away with without
being done. What does help is that nearly everyone is driving at exactly the same speed.
If you cut out speed differences then its easier to absorb lane changes and also less
inclined to make lane changes so much.
Q: With electronic control in each car but not coupled together , would you still get this
same resonance and hysterisis effects?
Things like the Mercedes S class , not a lot of them, configured so locally stable
but you never would have enough of them for the string stability question to be
relevant. Same thing presumably inside the top end Passat.
Q: Would it take say 10 percent of such vehicles on the roads to have an effect overall. ?
I think its more like turning the numbers the other way round , certainly a majority.
Typically use radar and maintain a safe spacing with the vehicles around. At a local level, in
the platoon sense they are stable . They'd have to be or they'd be a terribble driving
experience. What would happen if they were all chained together has not been really
worked on as it would not realistically happen. But it would be a serious consideration
if everyone owned a googlecar.
Q: Car-flocking , where you can almost have them literally nose to tail and rely
on communication between the cars?
Not just sharing info with near neighbous but possibly transfering information
down the road in front of you. The issues are what is the latency on those wireless
communications . Latency could be similar to reaction times of humans .
What could happen if all connected then a vehicle encountering a problem 1 km upstream could send a message
to start braking now . You would all have to slow down but the idea is limiting it to the immediate
vicinity and not sparking off a stop&go.
2 things we do know is that these automatic systems can start to respond , ie braking
, much earlier than humans do . Second , if you don't have this area-wide information
say 2 or 3 vehicles ahead then this can stabilise things and calculations to back that up.
Q: You said this often occurs where junctions join the motorway, can anything be done
to regulate the flow onto the motorway?
Called ramp metering? , so a red traffic light on the on-ramp and periodically switch it
to green. Different ways of using this in different parts of the world. The way its used here is not
to limit but to regulate it. Only letting 2 or 3 vehicles on at a time, particularly HGVs.
If you limit to 3 vehicles at a time then nothing really bad kicks off. In the US its used
to limit flow. keep at red, let a queue build up , problem with governments then
as any queue that spills backwards from the on ramp into the local authority local roads then
serious issues of responsibility . Looking into the future there is pattern matching of
traffic and traffic shaping. The idea there that these lights would then know about densities of traffic
arriving upstream and so configured that say with a flock of lorries, a good time to keep the light red.
Q: Do we know if this 50mph limit is related to the slowest vehicles on the motorway?
or the reaction times of people?
Anecdotal , for me but I could perhaps dig out some data to back this up, part of a long list
of to-do things that I never get to do. I used to drive on the Wiltsshire M4 a lot and
Wiltshire police a few years ago announced they were going to enforce the 70mph limit
by camera vans on random bridges. Because they had had a lot of very big smashes.
Still motorways are relatively safe per km of all roads. The slow lane in the morning
would be lorries , the fast lane nearly 100 mph , middle 80 to 85mph. So no one travelling
at 70mph . Then people thought the most I can get away with is 80 unless
I see a camera van in advance, knowing where the bridges are that they can park .
The speed advantage over the lorries is only 20 mph , may as well start using the
left hand lane again so changing the capacity of the road.
Q: so if the slowest speed was 80 then the best speed to run everything would be 80?
Then the reaction time thing comes into play.
Q: Could we have something like the air traffic control system , filing flight plans
in advance , as we are all going to be connected via the internet ?
I've got a video of this . Fly out of Heathrow recently , terminal 5, there are driverless
vehicle system , little pods on concrete tubes , working exactly on this slot booking
system from air traffic control. The problem there is concerning scaleability.
You'd have to run synchronisitly on small network component , and have
places relaxed as you couldn't possibly centrally control something as big as the UK
with 30 million vehicles.
Q: When you see film of starlings , tens of thousands together, is it to their advantage
being able to move around in 3D . You do see patterning effects but not colliding . In free
flight you don't get obvious bands of jams . ?
There are models for this behaviour, by a Hungarian called Vicsek each of the agents looks at its
immediate circle of neighbours , looks at their velocities , what is their average
velocity , so directed speed, make that my own velocity . So they all line up locally .
Then another bit to it is I want a certain spacing from my neighbours and
that is like the physics of atom spacing .
Monday 19 Mar 2012 Dr Dennis Doerffel of REAP, Renewable Energy and Alternative
Propulsion Systems, Southampton, talk on technical aspects of electric
cars complete with a Li-ion powered mid-range performance-wise
production example in the car park to explore .
27 people, 3/4hour powerpoint+talk, 3/4 hour Q&A
In 1998 we bought our first electric vehicle, one seater, just interested with playing around
with the technology really. Discovering my interest in battery technology. I decided to go to the
UK in 2001 and a PhD here on energy management. All types of electric vehicles and
since 2003 we founded a company called REAP systems, Millbrook, Southampton where I'm the technical director.
Specialising in large lithium ion battery technology. Our aim is developing battery
management systems and also energy storage solutions. We have supported more than 150
projects worlwide since 2003.
I will talk not only abiout the technology of batteries but also the background and motivation
to go for electric vehicles because its a topic people like arguing about. The first cars were
actually electric , Edison and more than 100 years ago. Replaced by combustion engine cars
and a main reason behind that was due to another electric motor - the starter. Before that
they were not user friendly .
Cars are not just for transportation , to some its a means to freedom to go where you want
at the time you want without having to rely on public transport. Its also can be an
expression of fun or status , comfort and luxury. Some say they prefer one hour sitting
in the car than 1/2 hour on the tube in London . Its become part of living for many
many people. We have to bear all this in mind when we look at other alternatives.
These days we are aware of the impacts of cars and the resaources consumed by such cars
are impacting everyone.
Noise, local pollution, danger to pedestrians and cyclists, waste in manufacture and use
of cars and also when disposed of. They consume materials, CO2 is produced , oil and
energy in general is consumed and they need lots of space on the roads and for parking.
Its not just an energy problem.
Per capita consumption of gasoline graph for different countries.
For a very developed country like USA its high, in Europe room for growth but hopefully
a bit more efficient. A daunting look into the future is for countries with big potential
like Brazil, China and India is virtually not on the scale.
Existing OEM car manufacturers are always looking fo rnew markets and larger
production volumes . Building plants in China and India trying to sell they're cars to
them. Naturally people want to buy them. Not as efficient as the Europeans but at some point they
may reach the level of European use. You can guess where this is going.
Another argument often hearsd is that oil will never run out. That is true in my belief .
But that does not really matter, because what does matter is the so-called peak-oil.
That is the point where people see that consumption is maybe still going up but the
supply is starting to go down so the prices go up. We have pretty much reached that.
In mid 1990s they were predicting about 2005 so not too wrong. Because of dependency
on oil it may start one or another war etc - the scarey point, when people start to fight
about the last resources.
So what is the future , how are we going to fulfill our needs for transportation and
other needs that are attached to the car nowadays.
One way is to leave all our cars in the garage and use a bicycle, which would be sustainable
and no problem if you do it voluntarily. Forced to do it, older or disabled or you need the car
for other reasons , working away etc. Then it would not be so much fun.
So what will be the future, We have to find a car that fulfills all these criteria and more.
Energy consumption needs to be low, the car needs to be affordable, versatile .
A new car always has to have added value to it, compared to whats already on the market.
So no manufacturer normally brings out a model that has less features than the old
models. Needs to have good performance, low impact towards pedestrians and cyclists .
In principle there are 3 approaches to this.
Evolution - keep the combustion car but make it better
Migration- keep the combustion engine but start using different types of fuel
Radical change - no combustion car any more , electric or fuel-cell or whatever.
How to explain different technologies in a visual manner.
Set up as balance scales, conventional car - if you want to improve the impacts or
sustainability ,ie moving up, then the other side will probably come down, so costs
will get worse, versatility goes down and the feel-good factor goes down.
Buy a car with good fuel economy it is probably very small , not an exciting car etc.
This can be improved by improving the technology , eg 20 years ago a diesel car was something
boring , nowadays even Jaguar make diesel cars .
In the last 100 years, the rules of the market have driven the graphical left side up -
cheaper, cheaper, cheaper or more features, more power , more performance, better versatility,
better feel-good factors. But now global sustainability , drawn here as a magnet , we
must improve the left side. Ther graphical jack has little room to move it further out
so even putting a lot of effort into the technology we will not be able to fulfill both.
So evolution, just improving the combustion engione car will not be good enough.
What are the alternatives as people like to compare with electric vehicles.
The hybrid or electric car, renewable or sustainable fuels like ethanol or biodiesel .
The other topic is the fuel cell, so hydrogen and just water coming from the exhaust .
More human power or smaller vehicles.
Renewable fuels were the topic 2 years ago in the UK, the government thinking this is
how we make all the cars more sustainable. Biodiesel, alcohol; natural gas does not quite
count as it is not sustainable. Biogas would be quite sustainable; coal is an alternative
but not sustainable at all. Wood pellets, in theory - I'm sure someone could figure out
how to use them in a car.
Can enough fuel be produced? What is the cost of this fuel? What are the overall emissions?
So in the US they are generating ethanol, but if you need more coal(?) for the plantations
to generate the ethanol than the energy the ethanol is actually storing, then there is no
point. So they have managed to have a negative efficiency. And you still haven't done
anything about the noise impact of the cars, it's just a different type of fuel.
But the main reason why these fuels don't make any sense is that the way they capture
sunlight is pretty inefficient - something like 4 percent efficiency. A plant is not
an efficient converter of sunlight into fuel. You need a lot of land area and that area
competes with food production and general living space. In a world where there
is still a lot of food poverty, this is not really sustainable.
So how much land would be required to satisfy world energy consumption, using plants
as the converter? 223 times the UK land area, which is 41% of the world's total land
area. Or 24,000 square metres per person. You could try seaweed. It would need to be
planted with sugar cane plantations, which is the more efficient way of doing things,
and which we cannot grow here anyway. Here we would have to use rape seed and that is
even less efficient. These data are based on Brazil, the country that does best at
converting sunlight into fuel.
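As a rough sanity check on those land-area figures, a quick calculation - the UK area and ice-free world land area used here are my own assumed round numbers, not from the talk:

```python
# Rough check of the biofuel land-area claim: 223 x UK land area
# as a fraction of the world's land. Constants are assumed round
# figures, not from the talk.
UK_AREA_KM2 = 243_610          # approximate UK land area
WORLD_LAND_KM2 = 134_000_000   # approximate ice-free land area

plantation_km2 = 223 * UK_AREA_KM2
fraction = plantation_km2 / WORLD_LAND_KM2
print(f"{plantation_km2:.3g} km^2, about {fraction:.0%} of land")
```

With those assumptions the result comes out in line with the ~41% quoted.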
Another option would be to use solar-steam plants to produce hydrogen, which could be
transported across the world and used in combustion or fuel-cell cars. To supply the
world's needs that way you only need an area the size of the UK, and it could be in areas
where there is no competing usage - eg in the middle of a desert. Sounds like a dream,
and a few countries put a lot of money into this, but what was the problem? In a car you
normally have fuel which you need to combust to produce mechanical energy; for a series
hybrid you produce electrical energy - not very efficient. If you use a fuel cell you
generate electricity and run the vehicle on that. Still sounds like a dream: no local
pollution, no noise, highly efficient, little heat generated, and the fuel can be stored
with high energy density so the range would be decent. The big problem is the technology
was not ready, and isn't ready despite all the money and effort, and there are also
infrastructure problems. How to distribute hydrogen? It's very expensive infrastructure,
and people in Germany folded the project when they calculated how much it would cost the
country. Building steam plant generators would not be cheap either. It could be the
solution in the future, but I think Germany, the UK and the USA decided it would not
help in the short term.
So back to human power maybe. That would certainly solve all these problems but it's not
everything that people are looking for. To buy a vehicle there are a number of other
requirements and inter-dependencies within them, eg to improve the feel-good factor for
an electric vehicle you need a big battery and a powerful motor, so it gets heavy, then
noise goes up and energy consumption goes up, while the impacts would normally require
the mass to be low.
So why are people talking about electric vehicles, and what are their key advantages?
Slide showing, for different speeds, how much power a car needs to run around - driving
a small car up a 4 percent gradient, the maximum slope on UK motorways. At the national
speed limit of 120 kph you would only need about 35 kW, and you would not normally be
driving up hills all the time, just a minority of the time. So why much more? A car with
less than 100 kW is almost not a car nowadays, so 3 times as much as you need. The
combustion car produces that power in only a small region, at high rpm and with lots of
noise. No one would use the car normally in that power region; low rpm feels better.
The bigger the engine, the better the feel-good factor, unfortunately.
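The ~35 kW figure can be reproduced with a back-of-the-envelope power model: climbing power plus aerodynamic drag plus rolling resistance. The mass, drag and rolling-resistance numbers here are my own assumptions for a small car, not the speaker's:

```python
# Power to hold 120 kph up a 4% gradient: climbing + aero drag +
# rolling resistance. All vehicle parameters are assumed values.
m, g = 1200.0, 9.81          # mass (kg), gravity (m/s^2)
v = 120 / 3.6                # 120 kph in m/s
grade = 0.04                 # 4% motorway gradient
cd_a = 0.30 * 2.0            # drag coeff x frontal area (m^2)
rho, crr = 1.2, 0.01         # air density, rolling coefficient

p_climb = m * g * grade * v
p_drag = 0.5 * rho * cd_a * v**3
p_roll = crr * m * g * v
total_kw = (p_climb + p_drag + p_roll) / 1000
print(f"{total_kw:.0f} kW")  # in the region of the 35 kW quoted
```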
So if you take a big engine and you only need 35 kW: this plot shows the typical
efficiency of a combustion engine - engine speed, engine torque, and the different curves
show different regions of efficiency. The highest efficiency region is outlined, at
relatively low speed and high torque. It shows the effect of slowly accelerating in first
gear, quite a way from high torque. Even in fifth gear you would be some way from the
point of maximum efficiency. This is for a small engine, the Toyota Prius; take a big
engine and it would be even worse. An engine is usually running in a very low efficiency region.
Whereas an electric motor is slightly different. Torque and speed of the motor again;
the area of highest efficiency is over 90 percent. Firstly it is quite high and secondly
it covers quite a large range. This is the main reason people looked at the electric motor.
Comparison of cars: 1000 kg just to run one person around, and consumption maybe 43 mpg.
A bicycle would be 15 kg, so you can guess the sustainability of those 2 solutions.
In between, perhaps a moped: 100 kg and perhaps 100 mpg. Or make it electric, and the
fuel equivalent, in terms of kilowatt-hours, is 380 mpg, purely because the electric
motor is so much more efficient, especially in urban traffic. Or there could be small
electric vehicles that give low weight and high economy. For urban traffic the electric
vehicle makes more sense because of the small range; 80 percent of such journeys are
less than 40 miles. So the car could be small and so light, and the battery could be
small, so overall relatively cheap.
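The mpg-equivalent figure comes from converting battery consumption into petrol energy. The consumption and energy-content values below are illustrative assumptions, so the result lands in the right region rather than exactly matching the 380 mpg quoted:

```python
# Convert an EV's electricity use into a petrol mpg equivalent.
# Assumed values: a light EV at 70 Wh/km, and a UK gallon of
# petrol holding roughly 40.9 kWh of chemical energy.
WH_PER_KM = 70.0
KWH_PER_UK_GALLON = 40.9
KM_PER_MILE = 1.609

km_per_gallon = KWH_PER_UK_GALLON * 1000 / WH_PER_KM
mpg_equiv = km_per_gallon / KM_PER_MILE
print(f"about {mpg_equiv:.0f} mpg equivalent")
```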
An example we own is a very light 2-seater, quite fast as well - 55 mph - which you can
also pedal, fully covered for protection from the rain. The main problem with this sort
of vehicle is that they are manufactured by very small companies at low volumes, so the
price is very high. The reliability is low and the cost of getting it maintained to keep
it on the road is really high. You basically need to be a mechanic and an enthusiast to
have one of these.
So, slightly bigger, another of our vehicles is the Think; Ford invested a lot of money
to get this vehicle on the road. There is a movie called Who Killed the Electric Car
telling the story. A new movie called The Revenge of the Electric Car is out; we are
planning to organise a viewing of that, at some point, at the university. I would say
that most of the problems are not actually technical problems but commercial problems.
The urban runabout is what makes most sense; the trouble is no one wants to spend any
money on that sort of car. How much are you prepared to spend? £4000 maybe, but if it
goes above £10000 then they say, I could buy a proper car for that. Limited range, so
you can only run around cities with it. There are a few manufacturers trying to push
this technology. The Revenge film shows what sort of risks these companies are taking.
These are very expensive programmes and if people are not buying these cars then the
CEO would definitely leave that company.
With the pure electric vehicle, compared with the combustion engine car, it is easy to
meet global sustainability because you can run it fully sustainably - producing the
energy yourself with solar panels, storing it and putting it in your car, no CO2 apart
from production, and even there you could look into zero carbon etc and full
recyclability. The problem is how to keep the cost right and keep the feel-good factor
high enough. The technology aspect is still quite immature, and quite a lot can still
be done on both of those without having to trade one for the other. The versatility
will always be relatively low, with limited range so far. Make the battery bigger and
it becomes more and more expensive.
Battery technology: when I started my PhD in 2001 the most common battery was the
lead-acid one. Graphed out are watt-hours per litre, the energy you can store in a
certain volume, and watt-hours per kg, the energy you can store for a certain weight.
Nickel-cadmium was around, and people were talking of nickel-zinc and nickel metal
hydride. People started dreaming of lithium-ion, as its characteristics are better than
the lead-acid battery. I was fortunate to be about the first in Europe to buy some
lithium-ion batteries, to put them in a vehicle and test them, in 2002. Battery
technology is making a difference at the moment.
A slide of the power you can get from batteries for a certain mass, and the energy for
a certain mass, on a logarithmic scale. At the time lead-acid and lithium-ion were
outside the goal for electric vehicles. The fuel cell wouldn't meet it either;
supercapacitors wouldn't meet it, but maybe a combination of supercapacitors and
batteries might.
Comparison of battery size for 50 km range: against lead-acid, lithium-ion is about
half the size and a quarter of the weight. Li-ion has a higher cycle-life; you can
deeply cycle it, keep it at any state of charge, and it doesn't get damaged. Keep a
lead-acid one at 20 percent state of charge and it will not like it.
The range is always limited, so from my point of view it's important the driver knows
what sort of range he can get - will I make it? The system should be able to tell you:
yes, you can get there and back. To go from A to B is in theory quite simple: you know
how much energy you've got, you know the sort of mpg, and if you cannot make the range
it only takes a few minutes to refuel. With batteries it's not that simple because,
unfortunately, batteries have losses. Lead-acid batteries have quite a large leakage,
and also the nickel-based batteries. The battery self-discharges even if you don't
drive it. Come back from holiday and you may be surprised; also the power can
deteriorate and may or may not come back when you cycle it again. So batteries are
complex systems; the performance depends on the temperature, ageing and treatment -
how it's been treated in the past. So it's difficult to predict this electrochemistry.
All the automotive guys I've met just cannot talk to electrochemists for some reason.
They don't want to understand that a battery does not behave like a fuel tank, which
is one of the problems with electric vehicles.
The combustion engine's consumption is so inefficient that it doesn't make that much
difference whether you drive uphill or downhill, a bit faster or a bit slower. A diesel
does 50, 45 or 55 mpg; it's not that big a difference. With a motor being so much more
efficient, your driving style makes a big difference, and the topography and the speed
etc. If you want to predict the range of an electric vehicle, first you need to know
how much energy you have in your battery, and then how much you are going to consume
for that driver and that journey. It's not that easy.
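A minimal sketch of the range prediction being described - usable battery energy divided by expected consumption, with crude derating factors for temperature and ageing. The function name and factors are hypothetical illustrations, not from any real battery management system:

```python
# Hypothetical range estimator: the point is that both available
# energy and consumption depend on conditions, unlike a fuel tank.
def estimate_range_km(capacity_kwh: float,
                      state_of_charge: float,    # 0..1
                      wh_per_km: float,          # driver/journey specific
                      temp_factor: float = 1.0,  # <1 in cold weather
                      age_factor: float = 1.0):  # <1 for an aged pack
    usable_kwh = capacity_kwh * state_of_charge * temp_factor * age_factor
    return usable_kwh * 1000 / wh_per_km

# A 16 kWh pack at 80% charge, 150 Wh/km, cold and slightly aged:
print(estimate_range_km(16, 0.8, 150, temp_factor=0.85, age_factor=0.9))
```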
You also have to optimise the life of the battery. If you buy a mobile phone with a
li-ion battery, most people complain a year later that the battery doesn't last so long,
but it's not enough to make a big fuss about as people buy a new phone after 2 years or
so. With a car it's different; the battery there may cost you £4000 upwards - that is
for the Honda Insight that we've got. You would want a system that looks after such a
battery as well as possible, especially if you buy a second-hand car. I've heard it from
Tudor? Prius people, whether they should buy a second-hand car when they have no
knowledge of how good the battery is.
Obviously one big topic around batteries is the price, which comes down to the
manufacturing, the chemistry and the liability aspects. Another is what our company
mainly works on, which is the battery management system. A slide of a real UK car with
a dashboard covered in instruments - fine if you're an enthusiast, not the solution for
everyone. We are trying to put it all into electronics: intelligent cell monitoring
circuits, battery controllers in various shapes and sizes, integrated into lightweight
and intelligent battery modules which can be used in a variety of electric vehicles
including electric motorcycles.
The good thing about li-ion is its high energy density and high power density, but the
bad thing is that it has a high energy density and a high power density. Which means
that if something goes wrong with it, it can catch fire, and the fires are not very
nice fires - look it up on YouTube. It's very important to look after the safety of the
battery, and because li-ion cells are so efficient there is no overcharge reaction: as
soon as the battery is full you have to stop. This is true for every single cell in the
battery, and there could be 200 cells in the battery or even more. The BMS, battery
management system, has to look after each and every cell - voltage, current and
temperature. There are other safety issues, like making sure there is no isolation
fault between the high voltage of the battery and the chassis so people can't get
electrocuted; again that comes under the BMS.
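The per-cell supervision described above can be sketched as a simple limit check over every cell. The thresholds here are typical textbook li-ion values I've assumed for illustration, not the speaker's product:

```python
# Sketch of a BMS safety scan: every cell's voltage and temperature
# must stay inside limits, since li-ion has no overcharge reaction.
# Limits are assumed typical values for illustration only.
V_MIN, V_MAX = 2.5, 4.2      # volts per cell
T_MIN, T_MAX = -20.0, 60.0   # deg C

def check_pack(cells):
    """cells: list of (voltage, temperature) tuples, one per cell."""
    faults = []
    for i, (v, t) in enumerate(cells):
        if not (V_MIN <= v <= V_MAX):
            faults.append((i, "voltage", v))
        if not (T_MIN <= t <= T_MAX):
            faults.append((i, "temperature", t))
    return faults  # empty list means the pack is within limits

# One overcharged cell in a 4-cell example pack:
print(check_pack([(3.6, 25), (3.7, 26), (4.3, 25), (3.6, 24)]))
```

A real BMS does this continuously for hundreds of cells, plus current and isolation monitoring.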
People looked into the hybrid system to combine the high energy density of fuel and the
high power density of an engine with high sustainability in one vehicle. Demonstrating
the hybrid process on a slide, with so many things that can vary, makes it look more
like a hydraulic system. OEMs only want one design and to make millions of them, not
hundreds of designs and only 10 of each. It's difficult to make the right sort of car
for everyone. The starting point is a large engine, a small motor and a small battery -
basically a 12V battery. You can hybridise from there by making the engine smaller,
which brings the efficiency up, with a motor for better acceleration and better
feel-good. A bigger motor means a bigger battery, up to a pure electric vehicle. The EV
has compromised/limited range because a battery stores about 1/10th the energy
equivalent of a fuel tank by volume. So you have an engine or fuel cell if you do need
the range - a typical series hybrid. With only the generator and electric motor you
have an electric continuously variable transmission, not a very efficient drive-chain
for cars.
Two main hybrid forms: the series hybrid has fuel tank, small combustion engine,
generator, battery, motor - under braking you can regenerate and store the energy back
in the battery - and a gearbox. Not very efficient, as there are many energy
conversions. If you allow this to be recharged from the mains, and that is fully
sustainable, ie solar or wind generated, then you could run the vehicle purely electric
most of the time and use the efficient part of the drive chain, and once or twice a
year for a long-distance journey the combustion engine would kick in. So versatility,
peace of mind: an electric car mainly, with some reserve to go a bit further when you
need it.
The parallel hybrid vehicle that is outside in the car park is the 10-year-old Honda
Insight: combustion engine, fuel tank, battery and motor, which can add a boost or
regenerate when braking. So you can make the engine smaller, and so run more
efficiently, but still have the acceleration that you'd expect from a car.
The most sensible would be small runarounds, electric, but commercially it can be a
disaster even for large companies - it can cost hundreds of millions. The only car that
satisfies that criterion at the moment is the Nissan Leaf.
Or you can go a route that is not so environmentally acceptable but will introduce a
lot of the new technology, eg the Tesla: much like the Nissan but with everything
bigger. The development costs are the same, the production costs higher as there is
more copper in the motor etc. But then you can sell that sort of sports vehicle for
much more money, $100,000 or £90,000. A limited market, so if something goes wrong you
only have to recall a few thousand vehicles and not tens or hundreds of thousands. The
profit margin is higher, the liabilities lower, and you can try out new technology with
more confidence. There are detractors, as the world does not need this sort of vehicle,
but it is a way to introduce new technologies. The latest technology never turned up in
the VW Polo first; it appeared first in the Mercedes S-Class or Jaguars. Even so, no
one knows how long this sort of large battery lasts. But the sort of people who bought
this car don't much care how long the battery lasts; they can have some fun. They have
money to spend on such a vehicle, but environmentally it's a disaster.
Do all the manufacturers use the same type of battery? If they agreed on common
batteries then the cost of production would come down?
I think all the large OEMs are using li-ion and they have partnered with a few large
companies making the cells. They are not exactly the same, but there are not that many
companies. I think there are fewer cell makers than car makers, so there is pooling.
But none of the actual individual batteries are common between car manufacturers. You
can't take a battery from one car and put it in another?
No, the designs are always different. There is one company, Better Place, trying to
push this concept of swappable batteries, all the same. I think car manufacturers want
their cars to be a bit individualistic/unique. Smaller car, larger car, different
range: to make an optimised battery for each job requires quite a lot of design work.
The packaging - how can you squeeze a lot of energy into that small space? A common
battery would be too big or too small for some and would not be space efficient. I
don't see that coming for the moment; maybe in the future, if people become more
desperate for cheap urban transport, it may come.
If every car in the world was electric, would we have enough lithium?
I personally don't know. You don't need a lot of lithium for the li-ion battery; the
main materials are copper, aluminium and some form of carbon/graphite. The mining
companies I've talked to have said there would not be a lithium shortage for a very
long time. Eventually there will be, just as there is for any other material that we
waste at the moment. We make laptops and mobile phones and throw them away after half
Is it recyclable from the old batteries?
At the moment they are able to recycle the plastic, the copper and the aluminium; they
don't seem to need to recycle the carbon, but I don't think they are recycling the lithium.
Technically it would be possible?
Possibly; I don't know. But it is quite a small quantity. I think the bigger problem,
with the mining companies, is that you can't scale up mining easily. If someone came
along next year wanting to make a million electric vehicles, then the mining companies
possibly would not be able to get up to speed fast enough. They need years to predict
their production volumes. So there is a problem: getting the volume up so the price
comes down only works if the supply is large enough, otherwise the price goes up. The
big battery makers - Sanyo?, Panasonic, HE, SK Energy etc, mainly in Japan and Korea -
have demand from other industries. They have been making cells for some time now, with
some profits, and suddenly the car makers come along; they think they are very
important, deserve all the attention, and want really low prices. But they don't talk
to the manufacturers in the right tone. So they ended up with high prices; that's one
of the reasons.
Is there any battery technology on the horizon that may make a difference, such as graphene?
I know people are working on lithium metal and lithium-air, and there will always be
improvements. Personally I think the technology available now can make transport
sustainable; it's for some other reason that we don't quite manage. Most of the
problems I've seen were commercial problems and not technical - commercial problems
people didn't pay attention to early enough. There will be improvements that will make
it more feasible and more attractive for the wider masses. But there would be enough
of a market now, and the technology is there, to serve that market. There are some
funny things going on.
If you look back 6 years, we worked with a few small electric vehicle makers. They had
problems with reliability and price, but they were maxed out at that time; they could
not produce enough electric vehicles for the demand. Then the big guys started talking
about EVs and the small guys lost their interest, assuming the big guys would do it
right: they will be fast into the market and we will be dead. So no investor would put
any more money in, no bank loans, and they were running on empty. Then the big guys
never made it. That has been a problem for a long time. We have an old VW Golf electric
from 1995, used for a long time, which has been perfectly fine, even with lead-acid
batteries. As soon as the big guys say they will start, it puts off the few small guys
from developing and investing; the risk is too high. Why didn't the big guys continue
the developments from 1995? They would be good now. They stopped when the government
money ran out.
We supplied one project that in the end led to a company getting a bail-out of 2
billion dollars - a lovely car. I was the first to drive it, as I tuned the
electronics. Great car, hope it comes to market; they got their 2 billion and they
stopped the programme. Looking back, I think perhaps we should have started making
that sort of car ourselves, because you make 10, 50 or maybe 100, and you can grow a
company based on that organically. But because all the big boys at that time started
developing and making things, we just supplied them, thinking they would get it to the
market, but they never did. Plus they never paid properly, saying, as a supplier you
would be interested in all the volume that would come later - and the volume never
comes. These days we are not interested if someone comes to us stating big volumes.
Would it make more sense to turn the financial model on its head? As most of the cost
is in the batteries, if they adopted the same policy as Rolls-Royce do with aero
engines, where they effectively charge the airlines so much per mile flown - so
effectively leasing the battery, with some electronics that indicate how far you've
been with it - maybe that would be the way ahead. If the cost and investment was coming
from the battery makers rather than the car makers, then you would not have the huge
upfront cost of the battery?
There are 2 aspects to this. The OEM big boys are traditional business people; they
think, we are the OEMs, we are important etc. They will not learn.
eBay wasn't set up by American Express; it was set up by a completely new company?
This has happened: "Build Your Dreams", BYD, in China is a company making lithium
batteries, and at one point they said they knew how to make the most expensive part of
an EV and could just add the more basic bits around it, and they started making EVs -
but that is in China with a big market. The other problem with that sort of approach is
that no one really knows how long these batteries last, or even what that depends on -
not until 15 years have elapsed. When I was doing my PhD ten years ago and approached
the OEMs, they weren't interested.
But if you did a growing thing, financially charging per mile, improving your
technology and adjusting your costs, you are reducing your risks from the battery
makers' point of view?
Peugeot have a leasing model, but they buy in the batteries from someone else. Because
it's a rolling thing, each year, knowing about the battery failures, they can increase
the cost of the leasing. For a jump in technology someone has to take some risk?
I've seen that sort of progression. The OEMs have tried and failed a few times and
there have been consortia forming. Saying, what is an OEM in car making? Go to their
specialist suppliers - battery, motor, inverter/charger makers etc - and ask them to
develop their vehicle, more or less plug-and-play, virtually. This is what we came
across in the last years. At some point the suppliers thought, this is crazy: they make
most of the money, we take most of the liability. They don't know what they want; we
have to tell them that as well. That's not a good business model. I've seen 2 or 3
projects now where there was a network of suppliers saying, we develop the car together
and then we share the profits, so there is no car manufacturer involved at all. It's
kind of moving towards what you said. The technology and the costs behind it are still
underestimated, but I agree it probably would work.
Your BMS that's monitoring each cell - does that give long-term data on whether one
cell is going to get corrupted or whatever? Can you hook that one out and put another
one in its place, because you know which one is failing?
It's been attempted, but we don't know yet.
If you had a number associated with each cell, logged in your electronics, effectively
saying how many years of life are left?
That is the aim. We even have aims of getting all the data from all the electric
bicycles in China, because there are so many of them. But at the moment adding to the
electronics inside them any datalogging, which may cost only 50p, is even too
expensive. Eventually it may work, the data coming from other markets where they are
already sold in much higher volumes.
If you left an EV car's headlights on over a weekend, what would happen? Would the
electronics shut down so the battery isn't fatally drained?
The lights are supplied by a separate 12V battery, so that would just empty the 12V system.
You've been saying that there has not been enough push from the OEMs to develop the EV.
What would you say would be the main driving force that would lead them to heavily
invest and develop them?
I strongly recommend we organise this movie, The Revenge of the Electric Car, because
it covers exactly that question. There are 4 completely different approaches.
1/ Nissan, who are trying to push a small EV into the market at high volume, where it
should be.
2/ Tesla Motors, who are trying the high spec/high cost/low volume idea.
3/ Small guys who are converting one vehicle after another themselves.
Any of them can make it - that was our conclusion; they could all fail or all win at
the moment.
In a large company - a company is not one person, that is one problem. At one company
we worked with there was a big push from high up in that privately owned car
manufacturing company, a big one. He said, this is the vehicle I want on the road in
one and a half years. Unfortunately there were too many in that company who didn't
follow that vision. They simply didn't have the same drive or push, and for some reason
they never got removed from the project. A lot of blame rounds and wasted energy.
What voltage are these batteries at normally?
Depending on the size of the vehicle: a small one like the Riva? could be 48 or 70
volts, but more normally 240V, or up to 400 or even 500 volts.
(Gasps all round)
Each cell is what voltage?
About 3.6 volts, so you need quite a few.
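Those pack voltages follow directly from the ~3.6 V cell voltage - dividing and rounding up to whole series cells gives the rough cell counts:

```python
# Number of ~3.6 V li-ion cells needed in series to reach the pack
# voltages mentioned in the talk (rounded up to whole cells).
import math

CELL_V = 3.6
for pack_v in (48, 240, 400, 500):
    n = math.ceil(pack_v / CELL_V)
    print(f"{pack_v} V pack: about {n} cells in series")
```

A 400 V pack therefore needs on the order of a hundred cells in series, which is why the BMS has so many cells to watch.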
What's the cost of the likes of heating, lighting and air conditioning?
I don't know the percentage, but it's a bit of an issue because the electric motor is
so efficient that it is not producing enough heat for the passenger compartment. At the
moment a car is heated very unintelligently: you blow lots of warm air in and it goes
out. For EVs people have started looking at insulating the car, or directing heat so it
feels warm without heating everything up, as only the driver needs to be warm. So seat
and steering wheel heating and more directed heat. I know they have tested cars in
Norway at -20 deg C and they still had a decent range, but I don't know the figures.
There are other cold problems: you are not allowed to charge li-ion cells below 0 deg
C, so you need to warm the battery and keep the heat in.
If the car has been sitting overnight at minus 10 deg C, does it move off very
sluggishly, or even refuse to let you move off?
Driving is not affected that much by very low temperatures. It's more the charging and
braking that are the issues with cold.
What's happening to recharge times? That was one of the problems: you had to leave it
for 12 hours to recharge, rather than 2 minutes like filling up with fuel?
We have always charged overnight, around 5 hours or more. It's a different mentality.
We got to the point where we said, we never have to go to a petrol station. Others
would say it is quite stupid to wait that long. But it's overnight, or during the day
when it's not being driven anyway. It will not be the solution for everyone. If you go
long distance then a diesel car is a good car. If you're a taxi driver then the
recharging will be an issue; a series hybrid may work quite well for taxis and for
buses. For a commuter the EV would be quite good.
Is there any likelihood of using flow cells? in cars?
Probably not - more for electricity storage; I think the responsiveness is not that
good, and there are the overheads for pumps and things.
For the more ignorant amongst us - flow batteries?
A big tank of electrolyte: you charge up the electrolyte, it is pumped into one tank,
and you have a full charge of electrolyte. Then to draw charge you let that electrolyte
pass through the battery, and you get the equivalent of a much larger battery. Your
energy storage is still limited in the same way as an ordinary battery; the advantage
could be faster recharge - take one bucket out and put another bucket in. But you need
the infrastructure for this. For EV batteries recharging need not be an issue; you
could recharge in 30 minutes, it's just that the power is not normally available. The
normal household outlet is limited to 13 amps. From a battery perspective there is no
problem with recharging faster.
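The 13 A limit translates into a charging time roughly like this, assuming UK mains at 230 V, an assumed 16 kWh pack, and a rough charger efficiency:

```python
# Why a 13 A household socket means overnight charging:
# power = volts x amps, time = pack energy / power.
VOLTS, AMPS = 230, 13   # UK domestic outlet
PACK_KWH = 16.0         # assumed battery size
EFF = 0.9               # assumed charger efficiency

power_kw = VOLTS * AMPS / 1000    # about 3 kW
hours = PACK_KWH / (power_kw * EFF)
print(f"{power_kw:.1f} kW -> about {hours:.1f} hours")
```

That lands in the "around 5 hours or more" region the speaker mentions; a higher-power grid connection shortens it proportionally.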
The 13 amp charge limitation?
That is the norm, as that is readily available everywhere. We are involved at the
moment with a project where you would use an onboard charger on the vehicle which can
be plugged into a special grid connection, charge faster, and also act as energy
storage for solar cells. That would be much faster, but then you need special plugs,
connectors and specialist installers.
Could the electric car be extended to bigger vehicles, like buses and lorries?
I think it would make a lot of sense for buses - perhaps not pure EV, but perhaps
series hybrid, because the average power for a bus is tiny and they have really big,
noisy, high-emission diesel engines.
In London they are using a number of hybrids already.
In urban traffic it always makes sense. If you go on motorways it makes less sense,
although we were involved in a project where someone had an idea: at the rear of a big
lorry the wind is going in a funny direction, which creates extra air drag, and you
could put a turbine there to use that energy and improve the drag, then use that
generated power to run the lorry when it enters a city. There have been other ideas
like that. It would make sense for the typical Ford Transit type truck running around
in cities - again series hybrid, as you're running around all day long and a pure EV
would need a very big battery.
For EVs to catch on, does it come down to whether they are more cost effective or more
desirable than a conventional car? Can you see a future where people would own two
vehicles, one for town and another for long distance? Or do you see hydrocarbon fuels
pricing themselves out, so an EV becomes cost effective that way? Do you see, in our
lifetime, an EV winning the likes of the Le Mans 24 hour race, and so adding to the
desirability of owning an EV?
I've seen the first in the US, where people often have big houses and lots of parking
spaces, and people didn't want to be dependent on one fuel - this could be one driving
force. Not so much the cost but more: what happens if someone in the Middle East
switches off the oil? Americans are buying neighbourhood vehicles, similar to a golf
cart, but you are allowed to drive them on the roads. So an SUV for the weekend and
next to it a small neighbourhood vehicle for daily driving - that's become a US hit.
Over here it is unlikely, as we don't have that sort of space. As for Le Mans, there
are EVs entering racing; at universities we've seen especially young people more
interested in racing electric stuff than combustion engine stuff. More and more
projects are running electric boat races; they put electric into anything. Once that
generation starts taking over, that could have an impact. It will be many things coming
together, hopefully growing slowly and predictably. About 2 years ago everyone went
crazy about EVs. We got swamped with emails and phone calls, and it was very difficult
sorting out the real ones, the people doing something interesting. We were wasting so
much time. Governments all over the world were putting money into these projects, which
I think wasn't actually helpful because the same things were invented and re-invented.
As you say, why isn't there just one battery type? All those government incentives
meant thousands of different designs and no one comparing them with each other - and
they're not even different sometimes. So there are some funny things that should be
helpful but are not always helpful.
Why the emphasis on cars rather than one-person transport?
It's starting now, I think. I showed the 2-seater with the pedalling; that design is from
1985 or so, in Switzerland. But it hasn't caught on yet. There is a lot of politics and
economic interest involved. When I started on my PhD I wanted to work on small,
lightweight battery vehicles and I was told off - I was told if I continued on that
route I might not get a PhD - so I had to switch to hybrids, because people didn't
see it as a mainstream solution at the time. It will come sometime, as you no longer have the
parking-space problem, no noise, and it is much more compatible with cyclists. It just leaves
what to do with the large vehicles such as lorries. It is those that people in small vehicles
or on bikes are afraid of. It is not compatible when you are next to the huge wheels of a lorry.
In my view most of that freight should go on the train, where they can plan everything, and then
small local distribution vehicles. It's such a big change that I have difficulty seeing it happen.
I can see many such things happening in places like China or India where they just build
a new city. They are also concerned about sustainability but they can just do it.
Then in the future we will be buying the technology from them.
At the moment they mostly don't have personal transport, so instead of giving up a car
they are getting one. Over here it is necessary to get people to give up their car. If you only
had a bike before, you're not giving up anything.
Maybe move to hire rather than ownership, so you choose a vehicle for what you need to do
on that particular day?
That can work; there have been some rental companies renting out EVs. But I would
say it is the EV that you are using every day.
I saw something intriguing last week, a reverse or alternative use of EVs,
in places with an unreliable electricity supply: to cover the running of a house, the
electronics etc, not washing machines perhaps - you plug the EV in in reverse
and drive the house for the 6 hours or so of a power cut? We live in an urban environment
with very few power cuts, but lots of areas of the world are not so fortunate?
India, Brazil, China - they do have power-cut problems. If you cannot make the battery
cheaper then one option is to lease it or suchlike, giving it more benefit.
So one benefit could be that you are parking your uninterruptible power supply in your garage.
Once we get smart metering and the smart grid you can actually earn money with the
battery, because sometimes the electricity suppliers have excess energy and you may be
paid to use it - they don't at the moment - and then you push it into the grid when they
desperately need it and you get paid for that as well. If you take energy from them when they
don't want to supply it, then it is really expensive. So if you have a big battery parked in your
garage that can do this sort of thing, then you're quids in. This is another project that we
are going to bid for, with another UK supplier, competing with the Germans.
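A back-of-envelope sketch of that arbitrage idea: buy energy in a cheap hour and sell it
back in a later expensive hour. All prices and battery figures below are illustrative
assumptions, not from the talk.

```python
# Minimal sketch of vehicle-to-grid price arbitrage with a parked EV battery.
# Prices, battery size and charge rate are made-up illustrative numbers.

def arbitrage_profit(prices_per_kwh, battery_kwh, power_kw, efficiency=0.9):
    """Best single buy-then-sell cycle, limited to one hour at rated power."""
    best = 0.0
    for i, buy in enumerate(prices_per_kwh):
        for sell in prices_per_kwh[i + 1:]:
            energy = min(battery_kwh, power_kw)  # kWh moved in one hour
            best = max(best, energy * (sell * efficiency - buy))
    return best

# Hypothetical day-ahead prices in pounds per kWh (note the negative price,
# i.e. being paid to take excess energy):
prices = [0.02, 0.01, -0.01, 0.03, 0.08, 0.15, 0.12, 0.05]
print(round(arbitrage_profit(prices, battery_kwh=24, power_kw=7), 3))
```

Even one cycle a day at these toy prices earns about a pound, which is why the economics
depend so heavily on smart metering and time-varying tariffs.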
Earlier you showed plots with gradient lines; could you go back to that, as
I did not understand it?
I can try; it's years ago I drew it. It's the Prius actually. Because it's a power-split
hybrid, they can electronically decide - knowing how much power they need
to propel at the speed that the driver wants - whether this power should be
coming from the electric motor or the combustion engine, in such a way that the engine only
runs on its optimum efficiency curve.
The gradient lines on that plot, 250 kW-hr, what are they?
It's grams per kWh, so fuel consumption per kWh of output - basically the fuel efficiency.
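Those contour values can be turned into a thermal efficiency figure; the petrol energy
content of ~43 MJ/kg used below is an assumption, not a number from the talk.

```python
# Convert brake-specific fuel consumption (the g/kWh contour lines on an
# engine map) into thermal efficiency. Lower g/kWh means higher efficiency.

def bsfc_to_efficiency(bsfc_g_per_kwh, lhv_mj_per_kg=43.0):
    fuel_mj_per_kwh = bsfc_g_per_kwh / 1000.0 * lhv_mj_per_kg  # fuel energy in
    return 3.6 / fuel_mj_per_kwh  # 1 kWh of work out = 3.6 MJ

# A good petrol-engine operating point sits around 250 g/kWh:
print(round(bsfc_to_efficiency(250), 3))
```

This comes out at roughly a third, which is why the hybrid controller tries to keep the
engine pinned to that best contour and lets the electric path absorb the rest.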
A pure EV would not have a gearbox, as presumably it is a pulse-width modulated drive?
Some do, some don't. It depends on the type of motor. If you go for a permanent
magnet motor you've got constant maximum torque over the whole speed range,
but that's not really what you want. Ideally you want a lot of power at zero rpm and a lot
of power at high rpm, which you can get from asynchronous motors for example,
or the cheaper series-wound DC motors. With a brushless or asynchronous motor you would probably
go with a gearbox, or you would need a massively over-sized motor. Not necessarily
a standard mechanical gearbox - it could be a CVT - or the Americans use an automatic gearbox,
very inefficient.
If battery systems happened to become very cheap in the future, can you see them
being retro-fitted to conventional cars?
Yes. I can see that happening without the batteries getting much cheaper. The reason
is the typical car lasts more than 10 years, and we will run into problems a bit faster than that.
The trouble is the car is already on the road, and with new cars - even the hybrids but
especially the EVs - if something is too weird then the uptake in the market is very slow.
I am one of the people who believe that car conversions will be part of the mix, because that's
the only way to get enough EVs on the road fast enough in the near-term future.
There are companies now designing conversion kits. A problem is that sometimes such things
are not done very professionally and it can put others off. So you buy some unprofessional
kit conversion, the car catches fire, it's in the press - that sort of thing will put people
off. If people do it DIY with 12V batteries and the like - I've also seen someone with a fire
in his garage from doing that. If that car is sold on then it is no longer under the control
of the originator, and it then has to be foolproof. Even Toyota Prius users have killed
their battery because they decided to help someone by jump-starting a conventional vehicle
and they used the high voltage battery - maybe just an urban myth though. But I can
picture that somehow.
Do any of your vehicles have a deliberate noise generator, perhaps a Ferrari engine noise,
for warning other road users that a vehicle is coming up behind them? Otherwise they will not
hear an EV - or will legislation have to be put in place?
The only thing we have that is annoying on the hybrid: when we reverse it beeps, but
only inside - I've never understood why. Lotus got a grant to develop such a
noise generator. I think it's a horrendous idea, as I find it all too noisy out there, so why
would I want to add to it. But you are right of course, as there is a danger. Having
driven EVs since 1998, you start driving in a different way because you know you are
silent, so you are watching out more than other drivers, driving more considerately.
In places like San Francisco, where there are a lot of EVs, it is very disconcerting
to be at a crossing and not be audibly aware of such vehicles - it makes you jump?
Have you driven one of these EVs in the countryside and come up behind a horse and rider
and spooked the horse? You as the driver are aware but the horse is not. It is very dangerous
if something it has not heard emerges from behind it - it can kill the rider.
One could introduce the noise like Lotus did, choosing V8, diesel or 2-stroke or
The noise is purely generated for people outside; it's not directed at you as the driver?
What I have seen is where you as the driver have the normal horn and a second one,
quieter but audible, so you can warn others without freaking them out - like a bicycle bell.
Monday 02 April, 2012 , Professor Paul Lewin , Soton Uni , on the Royal Society
Working Group considering the future of a pan-European electricity transmission network
Title: 21st Century Power Networks: The European Supergrid.
Changes in the UK's generation mix (e.g. introduction of windfarms, closure of coal burning
power stations etc) will have huge impact on power networks and this talk will discuss proposed
solutions at transmission voltage level (400 kV or higher) including the challenge of developing
(the world's first) offshore High Voltage DC transmission network.
3/4 hour presentation , 3/4 hour Q&A, 22 people
( First 10 minutes lost due to a technical problem; covered history of the National Grid,
design philosophy of n-2 redundancy for reliability, loss of high voltage labs around the UK. )
..... AC transmission is quite a good thing because, working at high voltages and alternating
current, we have a low current system so the losses are reduced. With overhead lines and
towers we can transmit over significant distances. Arguably the greenest solution, but
not everyone sees it like that. You have a system that is 100 percent recyclable at end of life
and incredibly easy to maintain. You can fly along in a helicopter with thermal imaging
cameras and check a hundred km in a matter of hours. So some of the issues we are going
to face, we've never had to face with that sort of structure. Based on its age and what it does,
it is not able to deliver the requirements of the next 30 years.
The future has 2 basic scenarios.
1/ The supergrid: increasing the interconnection, probably at higher voltages, where we
join the UK to Ireland and Norway - there are already connections to Holland and France - and we
can build on these. The transmission networks will shift around the North Sea from
being on land to subsea, and that brings in the concept of multipoint DC systems.
2/ The concept of a smart grid - implying the current grid is not smart, but it is very
smart. The real idea of that is real control of the demand side and trying to move away
from the n-2 coverage we have at the moment. In a smart system you would only need
one connection and one route.
If we develop for the future, it's like stopping someone on the street to ask for directions
and they say: if I were you I wouldn't start from here. How do we get to this vision of the
future? That leads to some very specific questions.
What will the networks look like?
Is an all-electric future possible? For politicians and the press it's electric
vehicles and low thermal loss in buildings - can we do that? Can we carry on making the 400 kV AC
network work as we move towards this future?
How to make future systems operate reliably? In 2005 the Germans decided to open a line
across the river Oder and took out Italy. The implications of opening one connection
so a ship could pass: a whole series of events, individual protection systems
of isolation stepped in, and Slovenia saw huge power flows over 3 hours through its
country to Italy and then back the other way. Italy has got virtually no generation.
A great shock to all concerned that Italy was trying to supply Germany via Slovenia.
This was completely unforeseen and unplanned for. So there is a big question: if you
create a European network and someone, say in Denmark, does something silly, how do you
explain that to people on, say, the Isle of Man.
The National Grid Offshore Development Information Statement is on the web.
They came up with a series of potential solutions. Ofgem will then lead the development
of the process.
2 elements. The 400 kV network comes from Scotland, where most of the generation
is, down to the south, going through areas of outstanding natural beauty. So additional
overhead lines are problematic, and the plan is to introduce point-to-point high voltage
DC links subsea. Some advantages, some potential problems.
The number 1 advantage of DC transmission over AC is that you can transmit over great distances
using cables rather than overhead lines. You're not using air as your insulator but some
dielectric material, and the cable is coaxial, so the outer sheath is earthed and there is no
external electric field - there is a magnetic field but that's something else. The contracts for
the Western Link have been placed: Siemens are building the converter stations at 800 million,
to convert AC to DC, and the 2 cables going to the west of the Isle of Man - 600 kV DC cables
coming into the Wirral - are a contract given to Prysmian. So 1.7 billion for a 1 GW connector,
and it doubles the AC capacity. There will be a lot of renewable energy coming from Scotland and
we need to strengthen the network. Stage 3 offshore windfarms are very distant from the coast,
at places like the Dogger Bank, and for those a radial network similar to on land, but via the
windfarms. Current policy is to have a wind generator offshore and a cable from there back to
land. Stage 1 and 2 farms were 132 kV AC, but you cannot transmit AC over more than about
80 km, because of the capacitance of the cable - you'd lose your energy
in losses. Whereas with DC we can go to 400 km quite happily. There will be platforms
out at sea that will allow multipoint connections, which for the moment we don't have the
technology for. National Grid are going to spend 30 billion pounds between now and 2030
to create this network. One of the more interesting things is the connection to Norway.
The France connection is from Sellindge in Kent to Calais, with the converter on land.
For the Norway link the converter would be on the Dogger Bank, so maintenance could
be a problem. The big plan is that the North Sea will become a high voltage transmission
network. The ENTSO-E European operators plan, by 2030, about 5 GW of offshore wind
energy, allegedly. Generally the level at dispatch - the amount of power available at
any instant of time - is between 15 and 20 percent of that. On top of the 36 GW of wind energy
we will have available an additional flow of 7 GW from other European states.
When we are awash with wind energy we can store excess power in the Norwegian
hydroelectric system - sounds wonderful. A bit alarming to realise what has to be
built in the next 18 years. At the moment there is only 1 cable manufacturer in the
world and it can only produce at a limited rate. So unless someone builds more factories
this will not end up as a functioning system.
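The ~80 km AC cable limit mentioned above comes from the cable's capacitive charging
current eating up the thermal rating. A minimal sketch, using typical assumed cable
figures rather than numbers from the talk:

```python
import math

# Estimate the "critical length" of an AC cable: the length at which the
# capacitive charging current alone consumes the cable's full current rating,
# leaving nothing for useful power transfer.

def critical_length_km(v_line_kv, cap_uF_per_km, rated_amps, f_hz=50.0):
    v_phase = v_line_kv * 1e3 / math.sqrt(3)            # phase voltage, V
    i_charge_per_km = 2 * math.pi * f_hz * cap_uF_per_km * 1e-6 * v_phase
    return rated_amps / i_charge_per_km                 # km until fully loaded

# 400 kV XLPE cable, ~0.2 uF/km capacitance, 1000 A rating (all assumptions):
print(round(critical_length_km(400, 0.2, 1000)))
```

With these assumed values the answer lands around 70 km, consistent with the ~80 km
figure quoted; DC has no steady-state charging current, hence the 400 km reach.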
Can we rely on the existing networks to operate without failing? There is a finite current
that you can push through a piece of plant before it fails. Overhead lines sag,
and so are generally self-limiting, because the rate of sag is well below the cutout
in the overcurrent protection and it takes significant amps before the sag is too much.
Other assets like transformers and cables get hot: you can fail a cable well below the
circuit current limit, and it has happened a few times. In the UK the last time was 1967
in London; the last big one was in New Zealand and took out the cables supplying the Auckland
central business district, taking 6 months to restore power and requiring a generator ship
moored in the harbour. You can't just assume that the network is there and you can
start plugging into it and it will work. With the 1.7 billion contracts, they state they
will be operational by 2015; you would not even get through the planning process for another
overhead line in that time-frame.
What will future networks look like?
From Imperial, a plot of primary energy - where we get it from and what we then do
with it. We import a lot of energy as oil, gas and coal, converted to some form of
use. Most is used for some form of heat; in that plot electricity accounts for
about 30 units of 130-plus units. If we go all-electric, with reducing and restricted
oil and gas, maybe more nuclear and more renewables: can we support a transport
infrastructure, can we heat houses? Do the sums and we would need a power station
every 400 yards, as we would need 250 GW generated, or at least available. We cannot
rely on renewables due to intermittency, and that is countrywide: if the wind is not
blowing strongly in the North Sea then chances are it's not strong elsewhere at the same time.
Perhaps a greater drive to waste heat recovery and building more thermal networks,
such as district heating schemes, more ??? plant. The other idea is using more hydrogen.
Maybe by 2050 we will have fuel cells and transport the hydrogen. It is relatively efficient
to create hydrogen using electricity; the issue is what to do with the existing natural
gas network. Can it be repurposed? Reuse for hydrogen is doubtful, as it seems to pass
through everything including plastic piping. Would you be comfortable with hydrogen
coming out of your cooker? Maybe repurposed for thermal networks.
Whatever scheme emerges seems to require more reliance on bulk electrical transport.
Pic of a 400 kV circuit breaker at Fawley. A great advantage of AC is that you can break
the circuit when the current passes through zero, available 50 times a second. With DC
you can't, because there is always significant current, and there is a real drive now to
develop DC breakers. Such an AC breaker covers a ground area something like this room,
but a same-capacity DC breaker would be 5 or 10 times larger.
We will see more cables, AC and DC, because cables can be undergrounded, and
out of sight is out of mind. This will lead to pan-European networks with more
point-to-point links connecting the UK and other countries; hopefully that
would keep the cost to the consumer to a minimum and make better use of resources.
A study in Ireland, which has huge wind reserves but no demand to match: their
model showed that the best option was just to shed the energy when they don't
need it, due to the cost of creating interconnection.
So we want to connect and buy the surplus off them.
Concerning reliability of the existing network, it's like moving into an existing 1930s
house with its existing wiring, and you decide to rewire everything except the kitchen;
then you worry later on that the work required to reinforce the kitchen would probably
be greater than the whole of the rewiring.
An AC system with converters: one end converts AC to DC, the other back to AC.
You can have cables or overhead lines, or in some circumstances back-to-back
converters. A large hydroelectric plant is shared between Brazil and Paraguay
and somewhere else, with different voltages and frequencies, so a series of back-to-back
converters. Also used in India, where different states have different frequencies.
If I construct an AC link in parallel with a DC link it gives many advantages.
Power flow can be controlled more effectively.
The system is more stable, less likely to suffer because it contains resonant
components, inductors and capacitors: someone turns something off and that can cause
nasty oscillations in the system, but AC in parallel with DC allows us to control that.
You don't have to worry about fault levels, as you can bring power to the level that is
needed rather than pushing it through the existing system.
DC can be transmitted over longer distances. With AC, as you go up in voltage the length
you can usefully transmit power over reduces, and at 400 kV it is 80 km; beyond that you
can put compensation in, but you are losing energy and not much comes out the other
end. With DC it's 400 km or more. Going pan-European we can connect with
There are 2 types of converter that operate at these voltages. The line-commutated
converter is built from thyristors; we talk about valves, where a valve might be +/- 60 kV
but is really an array of thyristors in parallel to get the ampacity up and in series
for the high voltage. There are at least 70 high voltage line-commutated
systems around the world. The available rating has been going up all the time and,
thanks to China, is now reaching 7.2 GW. You have to use old cable technology:
modern cables have a dielectric of cross-linked polyethylene (XLPE), which is low loss
and mechanically strong. The previous approach was to use paper, mass-impregnated with a
fluid - mineral oil, cable oil. Those cables are not brilliant as insulators,
quite lossy, but ideal for HVDC for when you switch anything off: effectively the charge
accumulated within the dielectric will leak away quite quickly. But with an XLPE cable it stays there.
There is a whole set of issues about how you turn these things on and off, and how
you manage that. With line-commutated it's not really 2-way; it works well if you are sending
the power in one direction most of the time. You can't use these on the likes of windfarms;
you must connect between strong points. There is a reactive power issue, which means
you have to have compensation for that when connecting to AC. One major thing that is
often overlooked is that you need a lot of filtering. When you look at the footprint of
a converter station, it may be a square mile, and most of that is filtering. So high costs.
A line-commutated converter for DC to 3-phase AC will have 2 stages.
Graphic of summing 12 possible pulse heights to get a crude sine wave, hence the
requirement for lots of filtering, as you don't want to put that into an AC network.
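The filtering requirement can be illustrated numerically: a 12-step staircase approximating
a sine wave keeps its harmonics clustered at 12k +/- 1 (11th, 13th, 23rd...), which is what
the filter yard has to remove. A sketch of the waveform, not of the actual converter topology:

```python
import cmath, math

N = 1200      # samples per fundamental cycle
STEPS = 12    # staircase levels per cycle, as in a 12-pulse converter output

# Hold the sine value constant across each of the 12 steps (zero-order hold):
wave = [math.sin(2 * math.pi * ((STEPS * i) // N) / STEPS) for i in range(N)]

def harmonic(k):
    """Magnitude of the k-th harmonic of the staircase (DFT, single bin)."""
    return abs(sum(wave[i] * cmath.exp(-2j * math.pi * k * i / N)
                   for i in range(N))) * 2 / N

for k in (1, 5, 7, 11, 13):
    print(k, round(harmonic(k), 3))
```

The 5th and 7th harmonics vanish entirely while the 11th and 13th remain, so a 12-pulse
scheme needs far less (but still some) filtering than a 6-pulse one.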
The more modern version uses IGBTs rather than thyristors, which have the advantage that
you don't have to wait for zero current before turning them off. First developed
by the company ABB, first installed in 1997. Their trade name is HVDC Light, because the
available powers are not as large as the line-commutated ones.
One USP of the system was that they developed an automatic system for burying the
cable: on land, with reasonable ground materials, they can bury 5 km a day
with little disruption. In terms of installed ratings it is going pretty high, ABB predicting
that by 2017/2070? they can have a 2.2 GW connection with 2 cables, one at +600 kV and
one at -600 kV. The filtering requirement of this design is significantly reduced.
So the converter station footprint is significantly reduced - so possibly ideally suited
for offshore. There are 3 or 4 companies worldwide, 3 in Europe, working hard
to develop this technology - ABB, Siemens, and Alstom in the UK - with different topologies
and patents but basically the same: a vast number of capacitors that you switch in and out
in order to create a stepped waveform. The result is a lot closer to an ideal sine wave, with
a lot less filtering required. So far these are for point-to-point: a single cable or pair
of cables, with a converter station at each end. That does not lend itself nicely to the idea
of networks. I would like, say, 3 converter stations in a multipoint arrangement, so
power can be sent in any direction.
So looking at applications: a large Siemens mock-up of one at sea, a significant platform.
The western and eastern links would be line-commutated, as they would be used for
boot-strapping, and we know most energy would be generated in Scotland and we want it down
here, with a sudden demand the other way unlikely. Then voltage source converters will be used
for connections from offshore wind farms, because they don't require a strong source
at either end. They lend themselves to interconnection. To have multipoint DC interconnection
you need to be able to isolate the HVDC cable, so if a fault develops in one of the cables
it can be switched off at both ends and you can still carry on with HVDC transmission.
That requires the idea of a DC breaker, 2 per cable, and that technology has a long way to
go. What will be the impact of all this on our existing networks?
Some of the work we're doing at Southampton.
However good things are with HVDC, we will still have some harmonic distortion - higher
frequency components - and we need to make sure interconnections work reliably. More and more
overhead lines are being replaced with cables. There is an operational problem: the Wokingham
control centre needs to know how much more current they can push through a cable
before it melts. There is a calculation that you do, and for each circuit they have a
cut-sheet(?) that tells them this; under emergency conditions these sheets allow them to
suddenly increase the current in a circuit, because of the finite time required to reach
dangerous levels - so if the overload is only 6 hours long they can do this.
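The 6-hour emergency rating logic can be sketched with a simple first-order thermal model:
the conductor temperature rise approaches its steady-state value exponentially, so a short
overload may never reach the limit. All temperatures and the time constant below are
illustrative assumptions, not real cable data.

```python
import math

# Time available at an overload before the conductor temperature rise,
# rise(t) = steady_state * (1 - exp(-t/tau)), reaches the permitted limit.

def hours_until_limit(limit_c, steady_state_c, tau_hours):
    if steady_state_c <= limit_c:
        return math.inf  # this overload never reaches the limit at all
    return -tau_hours * math.log(1 - limit_c / steady_state_c)

# An overload that would eventually settle at a 90 C rise, against a 65 C
# limit, with an assumed 10 h thermal time constant:
print(round(hours_until_limit(65, 90, 10), 1))
```

Because the time constant of a buried cable is many hours, the control room can legitimately
run well above the continuous rating for a bounded period.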
There is a fixed standard for that in the industry, well understood, but then you have the
example of Stratford in London, all overhead line that had to be undergrounded, and the issue
is calculating for one circuit in a tunnel; add other circuits and they all have an influence.
Until last year there was not a method for calculating this; we've been working on that.
Cable tunnels were getting hot, so more and more ventilation.
We had to model this with computational fluid dynamics, to work out where the air velocity
is highest - down the middle, where there are no cables - while the surfaces carry the cables,
so they are getting hot. In London it's an interconnecting network of tunnels: come out of a
small tunnel into a large riser and suddenly there is virtually no air passing your cable. That
bottleneck limits how much current you can put through that circuit.
A 132 kV cable, 11 kV cables etc. This gives info on how we move forwards to a smarter
grid, dynamically: say every 10 minutes the available ampacity for every
circuit in the UK is recalculated. Such work has saved the National Grid
1.2 million by removing the need for new tunnels that would have had to be built in London.
Faults can occur in such networks, and a worry, when moving to HVDC from the
UK via the North Sea to the rest of Europe, is how you guarantee that all that buried
infrastructure will operate reliably. If there is a cable failure there, instead of
restoration being a matter of hours, it will probably be a matter of months.
Video of an American transformer that has been hit by lightning and eventually
blows up. In the UK it happens occasionally, usually after some maintenance.
The fire brigades let it burn, as no one is at risk and the oil burns off. They
remain hot for months.
So we need condition monitoring to identify potential problems: oil samples
taken regularly and analysed for dissolved gases, like hydrogen
and acetylene, mean remedial action can be taken, and we can monitor 24/7.
But moving to very long HV cables - like the 800 million pounds of asset in the Western Link,
400 km long - we can only monitor the first and last 100 km. No one has worked out
how we are going to look after the middle 200 km.
A couple of weeks ago off Africa someone put an anchor across a fibre optic link and dropped
the bandwidth to Nigeria by 50 percent. Someone putting an anchor through an HVDC
cable could reduce our supply by 20 percent. It could happen, and we need to know
where it has happened and do something about it. Just knowing there is a fault is not enough;
if an incipient fault is developing then create a prognosis of the time before failure.
At Southampton we developed optical techniques for long cables. In 2002 a long
cable was 20 km, and the 400 kV cable from Elstree to St John's Wood was
considered huge then; we developed techniques for self-monitoring. So an optical
sensor sits where the cable is most likely to fail - always a joint, a termination
or some accessory - directly coupled to an electro-optic modulator; the sensor voltage
modulates the light passing through it. So a fibre optic filament is fed from a very
low noise source, passes through the electro-optic modulator, altering the polarization,
and we detect that at a receiver - completely passive, but only good for 30 km tops.
An example of such a system in Germany, used to detect a fault in a 400 kV cable joint:
you could see internal discharges and measure them.
We've moved away from doing quite small experiments in our HV lab at high field and low
voltage, to big experiments at high field and representative high voltages.
Some work on 3-phase cables for London: the operating company spent 3 million
pounds on data monitoring equipment, producing vast amounts of data, and there was
no one in the company who could interpret it for them. We replicated different
types of failure modes, giving the signals that would be produced, and plots
where this data is retrieved from their jumble.
A classic one is a jointer not wiping off the lead properly, creating a sharp point. The jointer
we brought in to create such a fault found it an affront to his professionalism
to deliberately make a bad joint. We replicated what happens when trees crush cables, seeing
what results appear in the data. We developed a single-sensor system that could identify
separate problems via a data-mining approach. Some were discharge problems, so potential
system failure; others are noise. Moving ultimately to online systems that are entirely
independent of human interpretation. Separating out multiple sources is just the
first step. We need to develop those ideas to the point where someone in the control
room has a really clear idea of the information being presented to them, prognostic
as well as diagnostic - a huge area of research. With being able to acquire data at higher
and higher sampling rates we are creating a lot of nothing, because invariably not a lot
happens. We are working on the area of predictive modelling: actually take fewer measurements,
and when we see divergence from what should be happening, we then go and look in closer detail.
It comes down to us not really understanding how electrical systems fail; the physics
understanding is very much lacking - again an area of research. About 60 people work in our
labs, with EPSRC as a principal funding source. One of my projects is to set the research agenda
for the next 20 years, involving about 10 universities, and to design components for the network
of 2050 - we're working on a 5 GW cable. In Southampton we are fortunate to have one of the
4 high voltage labs in the country.
You said we could not rely on renewables, even with connectivity to Norway for storage
of excess; perhaps we need to go to CHP and hydrogen. I've heard of a
pan-European grid that would extend from Algeria with PV to Iceland with geothermal generation?
That involved an ABB product. For us to import from and export to the Med is reasonably
viable. When you scale these things up you end up with horrific numbers. It's often said of
putting PV panels in the Sahara that the air quality is so bad from sand and dust that it's
actually not as efficient as you would expect. In terms of a smart grid, what we should have
is energy storage at higher voltage levels. We have the Dinorwig power station, pumped storage;
if we had more pumped storage that would help make renewables far more viable. If you
look at the engineering issues: if I build 24 GW on the Dogger Bank, I need a 24 GW
rated cable back to the UK - and how many days a year am I going to be generating 24 GW?
How much am I spending on infrastructure for 1 or 2 days a year? Which is why the Grid
is coming up with the radial idea, combining boot-strapping down the sides of the UK and
interconnection with other countries, as well as the wind on top.
When you talk of the difficulties of building supergrids and converters, does that mean we
are still many years away from having the technology to build a pan-European supergrid - we
cannot start on it now?
It's all happening - Google BritNed; UK to Holland is running, but there is no
condition monitoring on that circuit beyond the first few tens of km at either end.
Do you see a potential for linking up all renewable sources throughout Europe and North
Africa, so you could rely on renewables entirely, given all that versatility?
It's a very long answer to that. The short answer is, unfortunately, that the more
generation we have that is intermittent, the more constrained our system becomes,
and we actually need gas generation so it can come on stream within an hour. If there is a
sudden drop in wind, we need to start generating elsewhere pretty quickly.
And also in the situation of too much wind - feathering and stalling in high wind?
Yes, and the idea of people going out to the Dogger Bank for maintenance is a bit worrying.
Isn't Iceland further than 400 km from the Shetland Islands/UK ?
With the technology being developed, you would have a platform in the sea somewhere.
Where's Surtsey, that island that emerged in the 60s?
Or on Rockall maybe.
In Germany, I understand that with insurance on wind farms they need to renew the
gearboxes every 4 years?
There are designs of generator being developed which remove the need for a gearbox.
Then the power electronics will fail instead.
You mentioned filtering problems with the converters; what would be the position of
mechanical rotary converters, DC to AC?
The Ward-Leonard set; I don't think anyone has thought about that. I guess again it's
a question of scale and the powers that are required.
How efficient are the converters?
The LCCs, they are quoted at 98%, and that is the manufacturer's figure. The VSCs are less
efficient, so when you are talking of 1 GW, then at each end there is 20 MW of lost heat, which is
quite a worry. It's down to the limitations of the power device: because it's built of silicon,
the turn-off takes a finite time and so energy is lost. So there is a need to develop devices
with a much sharper turn-off. There is loads of money going into silicon carbide, but the
potential improvement is only about 15%.
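The 20 MW figure follows directly from the quoted efficiency. A quick worked version of that arithmetic, using only the numbers from the talk:

```python
# Worked arithmetic from the talk: a 98%-efficient converter station
# passing 1 GW dissipates about 2% as heat at each end of the link.
link_power_mw = 1000.0   # 1 GW link
efficiency = 0.98        # LCC efficiency as quoted by the manufacturer

loss_per_end_mw = link_power_mw * (1.0 - efficiency)
loss_both_ends_mw = 2 * loss_per_end_mw
print(f"Heat per converter station: {loss_per_end_mw:.0f} MW")
print(f"Heat for both ends:         {loss_both_ends_mw:.0f} MW")
```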
Is there any capability of capturing that heat?
That's back to the idea that every power station would have a thermal network associated
with it, for district heating. The converter would provide 20 MW of heat, but what will you do
with that in summertime?
What sort of physical size is a 400 kV DC cable?
Depends on the current, but something like 6 inches diameter.
I was expecting some sort of horrendous dielectric breakdown?
Generally about 10 kV per mm.
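Those two figures are consistent with each other, which a one-line calculation shows: at roughly 10 kV/mm dielectric strength, a 400 kV cable needs around 40 mm of insulation before any safety margin, in keeping with an overall diameter of a few inches.

```python
# Rough consistency check of the figures in the talk: minimum insulation
# thickness implied by the quoted dielectric strength (no safety margin).
voltage_kv = 400.0           # cable operating voltage
strength_kv_per_mm = 10.0    # typical dielectric strength from the talk

min_insulation_mm = voltage_kv / strength_kv_per_mm
print(f"Minimum insulation thickness: {min_insulation_mm:.0f} mm")
```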
Do you get long-term electrochemical breakdown because DC is always acting the one way,
AC avoiding this problem?
Yes you do. A lot of the issues with DC cables are to do with what happens to the internal field
when you switch them off or change the polarity in normal use. The question there is
about space charge within the dielectric: you can have significant charge enhancement within
there, which distorts the field, and then when you change the polarity that charge wants to
discharge, with relatively catastrophic results. There was a test done on a DC interconnector
between mainland Italy and Sicily, where they did 500 voltage reversals and the cable didn't
fail, so they went on to install that cable. That worries me from a testing point of view, as you
are probably stressing that cable more than it would see in its lifetime; camels and straws.
You say when you reverse the current in a DC cable you are creating a charge in the insulator?
You build up charge within the dielectric, the insulator, as space charge; there is some
conduction going on, it's not a perfect dielectric. So you end up with, say, your core positive
with respect to the outer screen, and I build up a block of positive charge somewhere. I then
reverse the core to negative, and instead of a nice graded field to earth, I've got this space
charge within the dielectric, which wants to go somewhere and usually does a lot of damage.
Basically a huge capacitor.
Will such capacitance have much effect on energy storage within a system?
That is one of the big issues in operational constraint. From a network 20 years ago
that was mainly inductive, with inductive loads of the likes of washing machines, we are moving
towards more capacitance in the network. In my opinion energy storage should take place
at transmission level. We should store for everyone and not at a local level. Some of the
smart-grid community think we should have pockets of local energy storage.
There is a lot of talk of people using their electric vehicles for energy storage, but if you do
the analysis, how can we ever have enough generation to support widescale electric transport
without rewiring what we have already? Whereas employing pump storage is a pretty good
use of resources. Others feel differently.
Does Norway currently have pump-storage?
I'm not sure, but I think they do have reverse pumping.
You are presenting a moderately bleak picture; are you projecting that electricity usage
is going to rise steadily despite low-energy light bulbs etc?
The input to your house is rated at 100 amps, so 20 kW plus. People say they are saving
electricity with these new bulbs, but what I really want is a dishwasher. The classic is
"my 4 kW power shower is not really doing enough for me, I need an 8, 10 or 12 kW one".
You have the capacity in the network to provide that. What you don't have is capacity for
all the people saying "I will have a 50 kW electric vehicle". Look on a street-by-street
basis and only so few people can do that before the distribution is overloaded.
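The street-by-street argument can be sketched numerically. Everything here except the 50 kW charger figure is an assumed round number (transformer rating, number of houses, after-diversity demand), chosen only to illustrate the shape of the problem:

```python
# Illustrative sketch of the street-level distribution argument.
# All values except the 50 kW charger are assumptions, not from the talk:
# a local transformer sized for diversified household demand is saturated
# by a handful of high-power EV chargers.
TRANSFORMER_KVA = 500.0          # assumed local transformer rating
HOUSES = 100                     # assumed houses on the feeder
DIVERSIFIED_KW_PER_HOUSE = 2.0   # assumed average after-diversity demand
EV_CHARGER_KW = 50.0             # figure mentioned in the talk

baseline_kw = HOUSES * DIVERSIFIED_KW_PER_HOUSE
headroom_kw = TRANSFORMER_KVA - baseline_kw
max_evs = int(headroom_kw // EV_CHARGER_KW)
print(f"Baseline demand: {baseline_kw:.0f} kW")
print(f"Headroom: {headroom_kw:.0f} kW -> about {max_evs} simultaneous 50 kW chargers")
```

So under these assumptions a whole street of a hundred houses supports only around half a dozen such chargers at once, which is the overload the speaker is pointing at.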
In that 1927 electricians' handbook it mentioned, for Southampton, a specific tariff for
charging electric vehicles, presumably as it was overnight, like cheap-rate night storage
metering?
Probably for dock vehicles, like milk floats.
The early generating stations would have lamp rooms, so with a running generator set, as
people switched their lights on and off, lamps would go on and off in the lamp room.
Encouraging a base load was a really good thing, as they did not have the infinite bus-bar
that we have now by having so many generators connected at the same time.
Something I picked up years ago, and I can't nail down whether it's an urban myth or not.
Before the national grid developed, generator station A could be linked to another area B,
and B to C, but no one ever dared link the whole lot together, scared of current hogging or
something. Then overnight, in some precursor to Wokingham, a lowly technician decided
"let's join them all together and see what happens"?
Probably very close to what happened
Are cryogenic power transmission systems likely to be viable?
We do a lot of work on HTS (High Temperature Superconductor) power plant because it's a
brilliant research area. One of my top 10 achievements is to explore the effect of electric
fields on the boiling behaviour of liquid hydrogen. Under a high field, liquid hydrogen can no
longer have convection currents; it goes straight to nucleate boiling, so you don't need a
huge temperature difference between a conductor and the bulk fluid. That is why a lot of HTS
power plant failed on energising: high fields and slight temperature differences. The other
area is solid insulation at cryogenic temperatures: virtually everything we have tried is
well below its glass transition temperature, so under fault conditions it's very
brittle, it just falls apart. There are some things; fault current limiters are a possibility.
These are a superconducting coil: when it sees a very large current it ceases to be
a superconductor and its impedance goes up, so it limits the rise further down the system.
Some are now being tested on the networks. The problem is it takes some time to recover
before it can be reused. What we work on is something called a fuse. It's cheap and if it
blows, someone comes along and replaces it; it's crude but it works. Fault current limiters
should ideally be automatic, but the superconducting ones may take half an hour
before they have recovered and can be reconnected. From the 1970s this has been an area
researched at Southampton, low temperature superconductors, Style etc, and everyone has
always said it's 10 years before being applied. So a straight answer would be: in 10 years'
time we will be using HTS power apparatus, but I'll probably say that again in 10 years' time.
If we could start again, for residential usage would we still use 50 or 60 Hz, or
would there be an advantage in going to 200 Hz or higher?
Not really. The technology has grown with the limitations of working at power
frequencies. In small systems like on aircraft it is useful, but for long-distance
transmission 50/60 Hz is fine. It would be good if we'd decided on one though.
Japan has two regions at different frequencies, 50 and 60 Hz, and difficulties connecting
them.
I thought it was the limitations of such massive generating apparatus, that you could not
speed it up faster for mechanical reasons?
There would be a transmission advantage in going to higher frequencies; the problem
is that some electrical failure mechanisms depend on the rate of change of voltage.
So go to a higher frequency and you will probably exacerbate those.
And for testing you'd need a 400 Hz supply; you don't just add new equipment to a network,
you test it after laying and connection. A big problem with testing circuits is using
resonant test sets, which oscillate around 50 Hz for a short period of time, but
they are huge, to test a large cable. How do you test a DC 400 km long cable?
Time-domain reflectometers only work with specific breaks?
Yes, what you're looking for is incipient electrical faults. It would probably be 0.01 Hz,
very very low frequency.
Any developments with intelligent materials, informing back?
We are trying to develop smart insulation materials, looking at adding fluorophores?,
chromophores? and now liquid crystals, so polymer-dispersed crystals. The idea being that
you could create paints or insulation systems that work by giving some clear
signal when they are energised, directly and passively on the transmission wires, not via
secondary monitoring circuits. Like HTS, it's 10 years away.
A big question on efficiency. On the diagram of energy flow in and out of Britain,
the 2007 Imperial plot, it was 70 million tons of oil equivalent and about 1/3 or 1/4 lost in
generation and transmission. Are those unavoidable losses from pushing electricity through
metal and spinning things? Will future technology allow us to be more efficient in generation
and transmission?
Part of that is that you have a daily demand cycle and an annual one as well.
So for example Fawley is spinning in reserve all the time, burning fuel while running,
and all that is doing is improving the power factor. It seems a loss, but without it there
would be other losses in the network. At the moment we don't have demand smoothing, so
consequently there are large generators generating for large demands,
and it is easier for them to carry on generating than to turn them off. So at various times
of the day and the year there is excess generation over demand.
Could we be more energy sufficient if the demand could be made completely smooth?
No. If you look historically at the interconnector to France, that has been running
almost continuously into the UK for a large amount of time. It's only since the development
of an electricity market that some of the coal generators have been selling their excess
energy, relatively cheaply, exported via that link. The reason that link was built was
the string of nuclear power stations on the French coast: they had excess energy
and we wanted it. By putting a 2 GW link in, it saved us building 3 coal-burning power
stations. So we created a network that is reliant on interconnectors as it stands,
because the gas-powered generators were designed, in their costings, to come
on once or twice a year, but they charge an awful lot to cope with that peak demand.
So even if you could smooth it all out, I'm not sure the system could respond
positively to that. In that case the gas generators would probably just shut up shop.
People get this idea that smart metering is the answer and at 3 o'clock at night
the alarm goes off and we all get up to make toast as it's really cheap to make.
In extremis it will never work, because as soon as you start to manage the demand
side you immediately remove cheap electricity from the system and the unit cost
will go up. Create a demand for cheap electricity and there won't be cheap electricity.
So I'm never going to get my free refrigerator from the electricity company,
like the free compact fluorescent light bulbs, doing the maximum amount of work overnight?
I think you've got to take responsibility for energy usage yourself. The worrying thing
is not necessarily the electrical energy requirements but the thermal. First we need
to develop more thermally efficient dwellings and offices, and also people need to manage
their own environments better.
But people are not using immersion heaters these days?
It's the gas consumption that is the big worry. Take the idea of carbon capture and storage:
do the analysis and the sheer volume of CO2 is already 9 times greater than
the whole world's oil reserves from start to finish, and you've got to get it in the ground.
By 2050 we will not be burning natural gas, just for economic reasons. So the thermal
demand has to be met somehow; that and transport are the big worries. Historically in
the UK the electrical supply was for lighting. Look at the growth of other uses of
electricity, such as domestic white goods and now the charging of mobile devices.
The source of electricity in Scotland is hydro, not hydrocarbon?
North Sea oil and gas come ashore in Scotland, so that's where they generate it.
In the power stations they are taking measures to reduce the CO2 generation.
The other thing about Scotland is that if you turn the gas and oil off, they have most
of the renewable resources: good offshore wind and vast amounts of tidal power.
Whether they become independent or not, they will be selling their energy to England.
The only proposal here like that is the Severn Barrage, which any analysis shows
would work quite well, apart from flooding Gloucester.
And once you build the barrage you lose the local tidal effect anyway.
It's a pity we are not bathed in sunlight, as that would help a lot.
In natural gas, which is methane, how much energy is required to break the bonds?
Out of my area of expertise
With hydrogen it is 2.8 units of energy to break the bonds and 1 unit of heat?
Hydrogen is not the greatest gas to transmit nationally, as it would be lost,
and metal embrittlement etc, so it has to be used at source. Hence the concept of fuel cells
and the charging of fuel cells. The UK DEC? are looking into it; the big worry is all
the residential gas boilers for heat. From now to 2050, how do we make sure we deal with
that? You cannot tell everyone to go over to electrical heating without significant
investment. You have to make sure that everyone's house has its thermal demand at an
absolute minimum.
I've seen a house where they much reduced their energy consumption, but they had
to make their house grow with great hunks of insulation?
That's where you will get your money back, ultimately. The figures show a greater
number of people going into fuel poverty. People equate fuel poverty with the cost of
electricity and gas, but it also impinges on everything people do; it's
transport as well. More development of electric tram systems etc needs to happen,
because we can't carry on burning petrol and diesel.
The petroleum companies decided to prohibit the 300 mile per gallon car ?
The best solution is probably the hybrid where you're recovering energy to recharge
batteries on the vehicle .
You said there was only one company making these high voltage submarine cables?
The Western Link is 600 kV polypropylene/paper layer, mass impregnated, with
silicone perhaps; only one company is capable of making that. With LCC stations
you can't use extruded polyethylene, so it's an old technology: kraft paper
coated either side with polypropylene to make a sandwich, wrapped around the conductor
with butt gaps, and mass impregnated with an insulating fluid to get rid of all the air.
They are ideal for DC because they are not very good insulators, so they don't
store charge too well; it leaks away. The risk with all this is you will build a network,
a 400 km long point-to-point cable, assuming it's buried 1 m deep. That route goes over 19
other interconnectors: oil, gas, fibre-optic comms and other power connectors, and no one
knows how this cable will interact. With other power cables you cannot pass close to one
another because of mutual heating. Subsea they will put down a mound of rock over the
cable that already exists, lay the other cable over that, and another mound on top of that.
Nobody knows how good that will be.
I read there was a global shortage of submarine cable-laying ships for windfarms etc?
Yes. National Grid have let a contract for them to supply by a certain date.
Prysmian have their own cable-laying ship and have just gone into manufacture; the first
section was delivered to the eastern HV about a week ago. Even if that factory is working
flat out it cannot meet the demand there is going to be, and there is no other factory out
there at the moment. Politicians are very good at this sort of announcement: saying we are
going to have a lot of offshore wind keeps the NIMBYs happy by removing it from their
eyeline, but no one considers that there isn't the infrastructure in place to deliver that
product. So you have companies bidding for projects and then trying to develop the support
technologies. As far as electrical industries in the UK go, we are leading the world, but
whereas 20 or 30 years ago we led it because we had huge levels of technical expertise, now
we are coming up with solutions but using external expertise to find the ideal ways of
progressing. In the Western Link there is a statement about having a multi-point terminal,
so an offshore platform built somewhere, so they can have another link across to Ireland.
So ships can collide with these; so build on the likes of the Dogger Bank?
There are problems on the Dogger Bank for construction purposes. We are tied
into this; it is our 2020 solution.
The DC cables have copper conductors?
Subsea yes; on shore they are generally aluminium. How long before pirates start
nicking subsea cables? The Western Link is £2 million per km.
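The two figures mentioned in the talk, £2 million per km and a roughly 400 km point-to-point route, combine into an order-of-magnitude cost for the cable alone:

```python
# Worked arithmetic from the figures in the talk: cable cost for a
# ~400 km subsea route at the quoted £2 million per km.
cost_per_km_gbp = 2_000_000
route_km = 400  # point-to-point length mentioned earlier in the talk

cable_cost_gbp = cost_per_km_gbp * route_km
print(f"Cable cost: £{cable_cost_gbp / 1e6:.0f} million")
```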
Monday 28 May 2012, Dr Jason Noble, Southampton Uni.
Science: theory and practice, a look
at the extent to which current scientific institutions and
practices are supporting the efficient discovery and distribution of new knowledge.
3/4 hour talk, 1 1/4 hour Q&A, 16 people
The talk comes in 3 parts. First, what is science meant to be about: the philosophy of
science, to give us a bit of perspective on where it came from and what we are expecting
from it anyway. Second, a current assessment of the institutions of science and their
health: an acknowledgement that science has certainly improved our quality of life, but
asking the question, are the current practices and institutions fit for purpose, given the
changes that have happened in the modern professionalisation of science? That includes the
new communication media for sharing of results etc; it's not clear that we are using that
to the maximum. Third, a simulation model of one aspect of the science system discussed in
part 2, where we explicitly ask the question: is this practice being done in the most
optimal way, and if not, how can we improve it?
A theme of the talk is this idea of historical inertia in system design.
Consider for the moment the strange system of the American electoral college.
If you are voting for a president in the USA you don't get to vote directly for a president.
You vote in a local state election, and the state's members of the electoral college,
state by state, 55 votes if you are in California for example, are allocated to the election
of the president. There is this body that stands between you and the president.
This was probably a good system in the days of a horse-based communication and
transport network. The idea being that it would take too long to get results in, so why not
hold a bunch of independent local elections and then send some guy to Washington on a fast
horse with the local results, to be collated there. Now that it's there it's very hard to
change, for no better reason than cultural entrenchment. Not a rational system, quite
archaic, but they are stuck with it, so it would seem, despite some of its manifest oddness
and even unfairness.
My suggestion is that there are similar things going on in the modern scientific system:
things that may have made sense once but are now not fit for purpose, or at the very least
could be improved.
These include our habit of publishing in commercial journals, hidden behind paywalls that
make it difficult for people to get to them; some aspects of anonymous peer review, which is
a mainstay of the scientific publication system but not necessarily good at what it's
supposed to do; the notion of competitive funding applications; and the problem of
over-subscribed PhD programs. The way that academic research is set up is a bit like a Ponzi
scheme, where gullible PhD students are lured in and used as cheap labour, with the promise
of research jobs that they will never actually get.
What is science anyway
From the philosophy of science, to give us an idea of what we think we want from science.
Historically science was a move away from some traditional ways of knowing things:
the notion that knowledge comes from God, or from some authority, or from moments of
mysterious revelation. Instead, trying to understand the world by carefully looking at it
and figuring out how it worked. The idea of putting questions to nature came out of
philosophy, originally called natural philosophy. It tends to be dated to the 1500s;
whether revolution or evolution is one for the historians. In 1543 two incredible
books were published: Copernicus on the revolution of the heavenly spheres and Vesalius on
the fabric of the human body. 1543 works well if we must put a date on the possibility of a
scientific revolution; science takes off from that point. It doesn't take too long before
people start being critical and start self-examining the process of science. How are we
seeking to discover new knowledge; how is this exercise supposed to work? The earliest idea
about the process was that it was inductive. The caricature was that you would go out into
the field, contemplate things, bring a notebook, notice some sort of regularity that would
impress itself on you, and discovery emerges from that. So, for example, from lots of
observations of the sun coming up in the east, you induce that there is a natural law that
says the sun comes up in the east.
Many people, starting with David Hume, pointed out that justifying knowledge gained from
induction is very tricky. Simply, the sun coming up every day last week doesn't
guarantee that it will come up tomorrow. From Bertrand Russell there is the turkey
that inductively concludes that it's fed at 6 o'clock every morning, and then one morning
at 5 o'clock its head is cut off, because it's Christmas. Inductive theory building can
always be unsafe. You get into circles about that. Science has done quite well out of this,
but the problem is it's using an inductive argument to justify an inductive argument,
pulling yourself up by your own bootstraps. What we are doing must be something more than
pure logical induction; I think we're doing a much more pragmatic task.
Karl Popper understood induction well; he could see that this caricature of observing and
inducing a natural law was not a good account of how science was actually working. He
noticed an asymmetry in this process: lots of positive observations prove nothing, but a
single negative observation can destroy a theory. If you believe all swans are white, all it
takes is one black swan to shatter that idea. So science was about trying to falsify
hypotheses; falsification was the big thing. Three consequences: experiments should be
designed so they could test and potentially reject hypotheses; scientific knowledge becomes
the list of things that we have not yet falsified, the stuff on the table that we have not
yet falsified being the best we have so far; and for something to count as science it must
have the potential to be falsified. People presenting ideas that are so vague as not to be
capable of being put to the scientific test are not in the science game. This was
effectively applied against the ideas of Sigmund Freud, much in the news in Popper's time.
Thomas Kuhn: real scientists just don't do what Popper says they should do. They don't
drop their favourite theories when some new piece of empirical evidence suggests that
they might be wrong. They do anything but; they cling to their favourite ideas and tend to
go down with the ship. Sometimes for science to develop you have to wait for the previous
generation to die or retire before new ideas will get into the mix. Science tends to
be done via a particular paradigm, resistant to change of paradigms. He stated 3 stages
of science. We start off in a particular field with a scientific attitude and a chaotic
mess, sorting out what might be an initial perspective on the problem. Then normal science
is what scientists do most of the time: experimental work that is designed to support
whatever is the dominant perspective, and when you get some whacky inconsistent result you
tend to assume, contra Popper, that it's a mistake: "I need to check that again, that can't
be right." Finally, Kuhn spent most of his time talking about revolutionary science, where
after enough results come up that are difficult to account for in the old way of thinking,
some creative person creates a paradigm shift, a new way of looking at the problem, and we
are launched into a new phase of normal science under the new paradigm. Examples would be
the shift from Ptolemaic to Copernican, Newtonian to Einsteinian, or in the softer sciences
the cognitive revolution in psychology, the jump from a behaviourist perspective to
computationally inspired modelling. Kuhn further said that opposing paradigms were
incommensurable, meaning that with fundamentally different ways of viewing the world you
cannot have a foot in both camps. You come up believing one thing or the other, but not
both.
A useful extension to this is the work of Willard Van Orman Quine and Pierre Duhem in the
middle of the 20th century. Quine said that our statements about the external world face the
tribunal of sense experience not individually but as a corporate body. We never reject a
single hypothesis; whenever some contrary piece of data comes in, we're not going to dump
our favourite hypothesis on the spot. If we observe that the radius of Venus is a bit bigger
than we thought, then it's not just our ideas about Venus that are in play, but our ideas
about optics and telescope construction. We may have a whole package of ideas that need
adjustment, and no single experiment bears solely on one hypothesis at a time. These
packages are theories or models; after Quine we come to science being the pragmatic
process of deciding which groups of hypotheses need to be modified or dropped. Quine was
attacking a complacent, confident view through the 20th century called logical positivism,
which also came out of Vienna: the idea that there were 2 kinds of knowledge, either
analytic truth, something you can prove logically on paper, or pure empirical knowledge,
which you could justify by reducing it to statements of sensed experience. Quine wrote a
critique in the mid 20th century pointing out that there is not a strong binary distinction
between those 2, more of a continuum, and we end up with no core analytic truth, just a
messy pragmatic process of sorting out which models work out best for us in a web of
interconnected propositions. The network changes all the time as we make new discoveries;
at the core of the network are heavily interconnected claims that would require huge
revision of our knowledge if they ever turned out to be wrong. So such a thing as an
electron or a gene is now so multiply supported from different directions that we can
imagine a contradictory observation, but it would require a massive readjustment of our
conceptual scheme to accommodate such a new proposition. At the periphery is where the
action happens in science: newer, uncertain ideas that may well have to be dropped next
year. Ideas in string theory are certainly out there, at the edge of what we know; people
are appropriately tentative about them, they are not definite yet. Quine's point is that's
the boat we're in. You have to drop the idea that you can obtain utterly secure analytic
truth. You're always in the messy pragmatic business of trying to figure out how the world
works by proposing and adjusting models.
That's my take on science. It's not like mathematics, a weird window onto Platonic truth;
it's a messy business of model readjustment. It's done some pretty impressive things.
It's done some nasty things as well. But how is its health in the modern era? What are the
institutions and practices that drive it? I'm referring to things like journals,
universities, research councils, the whole system. We can certainly note that science has
succeeded in its basic task of increasing the storehouse of knowledge. We know more about
more parts of the world now. You can say that science has led to economic growth, producing
all sorts of technologies, and leads to improvements in quality of life; people live
longer, cured of diseases that used to cause a lot of misery. Since the Second World War
science has gone through massive institutional growth: the number of scientists and the
number of published articles, something like a 40-fold increase since that time. Science can
no longer be the Victorian gentleman's effort that it once was. What might this change in
the size and scope of the enterprise have done to the institutions? A map of science with
colour coding of different subject areas: biological, physics, material sciences, and the
softer sciences of education, social science etc. The modern scientific landscape is
enormous: competing and not-so-competing paradigms, disciplines, specialties etc.
Another graph, starting in 1817, of the number of cited papers per year, rising from
200,000 up to a million a year. From 1817 through to World War 2 there wasn't a crazy rate
of increase in the world's scientific output. For whatever reason there was then this
massive increase in money invested in university science, in the number of people, and so
in the number of outputs.
Is it all going well then? It's my contention that science is in danger of being a victim
of its own success. One thing that is key is that the knowledge that you get from blue-sky
science is not going to be directly convertible into economic benefits in the short or
medium term. With things that have immediate commercial pay-off, R&D, you tend to find that
if you chase a blind alley the market will check you promptly: if your product doesn't work,
no one will buy it. In some of the areas of softer science in particular there can be this
feedback cycle problem, where what counts as accepted knowledge is not being evaluated by
some outside force but by a sort of jury of your peers. This means that if it's other
scientists doing the judging, with no external feedback, then there is a risk of a
pathological dynamic emerging in the system. For example, the link between some proxy
measure and the real goal: something like the number of papers you publish is a proxy
measure, when what you're really after is new knowledge. Hopefully it is a good measure,
but it might not be. So perhaps not as good a correlation as we would like between
measuring success or contribution and the actual task at hand, finding new stuff
about the world. An interesting parallel is with a problem from evolutionary biology.
There is a distinction between natural and sexual selection in evolution, e.g. the
peacock's tail feathers. Through a runaway pathological dynamic, the male tendency to grow
a long tail and the female tendency to go for long tails have moved away from the core
business of survival and entered their own strange mutually reinforcing dynamic: elaborate
tail feathers don't help them to survive, they just give an advantage in finding a mate.
You could argue, in the scientific realm, that publishing lots of papers on the same topic
might not be a great contribution to knowledge, but it's a good way to further one's career
and stay in a job. It's just one of the results of the system in play.
Some examples of the way this pathological dynamic may have played out, and also the idea
of historical inertia, where the scientific system is run one way but need not be.
A thought experiment: if you were starting a new country tomorrow that had no institutions,
no universities, and you were the minister for education, what would you do to set up a new
university and research system? Would you copy everything of the current one? Do we live in
the best of all possible worlds, or would some things be done differently? Competition is a
background assumption, seen as a healthy thing that motivates people. But excessive
competition can lower efficiency. So the modern struggle for resources between scientists
could lead to such high levels of competition that collaborative action, like sharing data
between labs, is not happening as much as it should, if you remember that our real goal is
the advancement of knowledge as a whole, rather than the advancement of any one person's
career.
Funding allocation. It is a truism that academic work is ever more reliant on third-party
funding from big charities, government research agencies or corporations. Researchers end up
spending significant amounts of their time on the practice of grant writing. You can argue
about what this process does in influencing the values and practices of science. Is it pushing
us in the direction we would hope to go, or is it simply a way for outside forces to control
what scientists are supposed to be looking at? Are there other ways of allocating funding that
would lead to better system efficiency? This is not just finding grant writing too boring:
are we locked into a silly system here? Could there be more sensible ways of spending
everybody's time? Access to published results.
We are in a strange situation where scientific publishers charge scientific organisations and
everybody else large sums of money to look at the very work that we have been producing.
This is the strongest historical-inertia argument. Hard-copy distribution of results was
once a very valuable service: before the internet, we really did need publishers' efforts at
distribution, so it was reasonable to pay them some money to do so. I don't think it is any
longer a reasonable system; it's just a system we're stuck with, because that is how it's
always been done. If someone says, "I'm no longer going to send my articles to Nature any more,
I will open my own open-access journal instead", this may not be a good move for their career.
In the Simpsons version of the US, a pair of aliens are running for president and the voters
debate which one to go for; it does not occur to them that not having a lizard-like alien at
all would be a better option. So individuals and organisations have been looking for other
arrangements.
There is more of a push to open-access journals now, and a huge moral case that tax-payers
should have access to the research that they have funded, rather than the 30 or 40 dollars per
article, or thousands per year, they currently have to pay to view research that they have
effectively already paid for.
Publication bias. Another topic from a collection that I and others are working on, building
models of this stuff. Because of the competition for space in journals, and effectively a
competition for eyeballs, same as the entertainment business, people want their papers to be
read. There is then a bias towards publishing, at the very least, positive results, preferably
exciting, controversial, sexy, interesting results. It leads to a failure to publish either
replications of work, seen as boring, or contradictions: "I tried that and it didn't work".
Send that to a journal that published a large result last year and you will find it is quietly
buried. This leads individuals to search for a statistically significant result and then
post-facto interpret or explain that result in a way that isn't really in keeping with the
logic of what good scientific thinking is all about.
This has serious consequences for the global error rate in the published literature. There is
a paper by John Ioannidis from 2005 titled "Why Most Published Research Findings Are False".
He's a statistician, a keen Bayesian, and his analysis is that, given these publication biases,
if there is a bunch of people out there searching for the magic p < 0.05 figure in some set of
results before they will send those results off for publication, and the publishers tend to
only accept things where that condition is true, it is easy to build a simple model to show
that, where p < 0.05 should mean only 1 in 20 of your published results are false, this bias
can bump that up to 50 percent very easily. We see this in the popular press, e.g. carrots
shown to cause cancer, carrots shown to cure cancer, alternating year by year. Someone
somewhere will find some lab results that show a slight statistical edge for carrot effects,
be interesting to both journals and the media, and get a lot of attention. But that is not
necessarily the best way to find the truth about the proposition in the long run.
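The 50-percent figure follows from a short Bayesian calculation, which can be sketched in a
few lines. The prior and power values below are illustrative assumptions, not numbers from
the talk: the point is only that when few tested hypotheses are really true, the p < 0.05
filter lets through enough false positives to rival the true ones.

```python
def false_discovery_rate(prior, power, alpha):
    """Fraction of 'significant' (p < alpha) findings that are actually false,
    if journals only publish results passing the p < alpha filter."""
    true_pos = power * prior          # real effects correctly detected
    false_pos = alpha * (1 - prior)   # null effects that pass by chance
    return false_pos / (true_pos + false_pos)

# Illustrative assumption: only 1 in 10 tested hypotheses is really true,
# and studies have a typical statistical power of 0.5.
print(f"{false_discovery_rate(prior=0.10, power=0.5, alpha=0.05):.0%}")  # roughly half
```

With those inputs about 47% of published positive results are false, even though every one
of them cleared the p < 0.05 bar.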
Anonymous peer review. It has been frequently criticised. Why is it anonymous? The idea was to
avoid undue power and influence, a healthy, positive urge behind it. But in practice I don't
think it works that way. I think the key problem is not thinking about incentives very well.
Reviewers are not really rewarded for their efforts, nor accountable for what they say, and
they are very much aware that they are behind a cloak of anonymity. That leads to a situation
where people either opt out of the reviewing process entirely, saying "I'm busy, I'm
publishing important results, I'll leave the reviewing to others". That is a tragedy-of-the-
commons situation, because good reviewing would help everyone. Secondly, it is easy to get
poisonous, spiteful or lazy reviewing, because there is no quality control beyond the editor,
trusted to choose good people, and that process is not transparent. From a game-theoretic
perspective, reviewers either have an incentive not to participate, the costs of reviewing,
or an incentive to criticise other people's grant applications or papers because it reduces
the competition in their situation. There are healthier public-reviewing alternatives, like
the reviewing we see on websites all over the net: "this person's reviews are particularly
valuable". If such reviews were permanently attached to papers... Things like reviewing
on Amazon or Reddit are probably healthier than some aspects of anonymous peer review,
because the incentives are right. I'm not talking about money, just recognition; people will
do the right thing for recognition. They will devote much time to this effort if you give them
something that counts for something.
Sustainable PhD programmes. If science is worth doing, then part of the process is that we
need to educate the next generation to know how to do it. At the moment there are probably
too many graduate students in the system, too many to gain employment at faculty level.
Some authors have compared the whole business to a Ponzi scheme, with deliberate,
cynical recruitment of thousands of people at PhD and post-doc level to use their cheap
labour, with the credit taken further up the tree. I'm a middle-ranking academic and have
graduated 5 PhD students into the world, and I'm working with about 8 or 10 more at the
moment. In a zero-growth university setting I should train one person to replace me
when I die and then I should stop. But I'm part of a system that says it's entirely normal
to have 5 to 10 PhD students at any one time. The maths of that is pretty obvious:
not supportable, something has to give there. I would be happy to train large numbers
of PhD students as long as it was clear to all involved that they couldn't all
get faculty research posts.
Now the practical side. The issues I've raised so far are kind of editorial; I'm not proving
anything is wrong, just raising the possibilities and giving my opinions. A colleague and I
chose to look at the problem of funding allocation: to turn the lens of some of the tools
that science has engineered, so game theory, computer simulation, optimality thinking, on
science's own techniques, practices and institutions, to see how we scrub up. Is our way of
doing things really supportable? Could we do it a better way?
This is part of a general problem of allocating resources that crops up all over the place:
the NHS, the military, all sorts of organisations. We have this much budget and we have a
million departments that want as much as possible; how should we share it?
Generally there are 2 approaches. The centralised approach sorts your candidates on some
quality scale, who needs it most or will make the best use of it: there is someone, or some
agency, in the middle who says the greatest need is over here. Then there is the decentralised
approach, often found in economic systems, something like an auction or a bid-tendering
process in, say, a construction project, where each candidate individually prepares a case to
indicate their own quality. We farm it out to all the individuals who are desperate for this
resource: OK, convince me, write a 10-page report on why you are the best person to get this
money, repeated for the others, and I will look at them all and give out the money
accordingly. The academic funding system is very much in the second category.
Note that the cost of assessment is there shifted onto the candidates. How much time is
preparing this bid worth? I may spend my entire working life preparing these bids without
knowing that behind the scenes I'm rated poorly and am unlikely to get the money whatever I
say. Then it's a huge waste; I should stop doing it and find some other form of work. In
science the real goal is real knowledge discovery.
In the UK there are 7 big research councils, about 3.5 billion pounds a year. This is
allocated by a competitive bidding process: submit a proposal, which is reviewed and ranked
down to what the panel can afford, and some slice of proposals across the top of the pile
get funded. Most are unsuccessful. The ESRC success rate in 2009 was about 10%; the EPSRC
in 2009 was 26% and is now saying it's lower, under 20%. The BBSRC is only 13%, so 87% of
people sending in proposals there are not going to get anything for their time.
An example from the USA: the Pennsylvania School of Medicine, around the time Obama injected
a lot of money into the NIH and NSF, allocated as usual, competitively. It led to an insane
level of time devoted to chasing this big carrot, about 2 months spent on it, not spent on
teaching or research. Grant writing in and of itself helps no one.
The UK is proposing cuts in the science budget, "screening out mediocrity", so it is
important to give the money to the right people. How efficient or inefficient is the current
competitive bidding system? You should expect some waste in such a system; it will never be
perfect. The waste that we see, is it due to funding the wrong research, the wrong people,
or excessive time wasted on proposals? Or a mixture of those? And are there other systems
that might do a better job?
So we put together a simulation to try and tackle this problem headlong. We imagined a
population of academics competing for funding. They are not all the same; they vary in
quality. So there is a variable Q that describes all the properties that you would want in a
researcher: a high Q means they are particularly bright, particularly productive, good
managers. In a perfect world we want to give the money to those who are going to put it to
best use. Just like the real world, we say there is only enough money in the pot to award
some of the applicants; we had 4 out of 10 given some money. We've simplified the world of
grant application: our model academics can only hold one grant at a time, whereas real-world
grants are multi-year, overlapping, of different lengths and collaborative, with large
teams, and people try to hold more than one. We simplified to one academic fighting for one
year at a time. Why do they want it? Think of it like a force-multiplier: it allows you to
get more done. So if you get the grant, it increases your research output by some factor.
In our case we said that if you get the grant you get 125% more output than the solo
research effort with no funding, so 225% of normal, more than twice as effective,
because you have 2 or 3 post-docs to help you out.
The strategic dilemma for each of our agents is that you have to allocate some fraction
of your time to writing a proposal, but this reduces the time you can spend directly
doing research. Where is the right balance? Perhaps that balance is different if you're at
the top of the tree, in terms of quality, than if you're at the bottom.
So a year in the life of one of our agents: proposals are allocated much as in the real
world, ranked and sorted, and the successful get their grants. The year pans out and the
academics get some research output. At the end of the year they update their research
strategy: based on what happened to them, they may decide next year to try harder, or
perhaps "I can coast a bit". Repeat for 50 years of simulated research.
How do we measure system performance; what do we count as a successful or efficient
outcome each year? After the system stabilises, we calculate the amount of research
output produced under the given parameters, the number of papers if you like. We also
calculate the hypothetical quantity of how much research would have got done with no
funding at all. The return is the amount of additional research work that you buy by the
injection of funding. So the graphs are in "units of knowledge", UOK.
If you were to just give the money out at random, some agents are high quality, some low;
even the low-quality ones occasionally discover something, and the high-quality ones do it
more often. You would get 25 units as your return on investment, so that is a baseline. If I
had a god-like view of the situation, knew exactly who was best and who was second best, and
distributed the money accordingly, that would give you 40 units of knowledge. We cannot do
better than 40, and we would hope to do better than the 25 of random allocation. How do the
simulated academics update their strategy? The first strategy we call the thermostat: if I
got funding last year I will back off a little, I've hit the desired temperature, I will
spend 10 percent less time on grant writing. Our academics devoted somewhere between 0 and
100% of their free time to grant writing. They may start at 30 percent, get no grant that
year, try 40 percent, still no grant, then 50% and so on. If you do that, you get something
much worse than random allocation.
Only 10 UOK in that situation. People are not thermostats, so we gave them some memory:
I can remember what happened over the last 3 years and rationally update in some way. I'll
stay flat unless I've been successful 3 years in a row, and then I can relax. With a window
of memory they can opt out of the system for a few years, then occasionally dip a toe back
in the water. In memory B, since we did not want to be accused of creating straw men, we
gave a longer memory window. With the most helpful tuning of parameters we could manage,
they did get up to the level of random allocation: using a window of about 6 years of
information for decision-making, all you get is the same result as random allocation.
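The model described above can be sketched in a few dozen lines. This is not the speakers'
actual code: the ranking rule (proposal strength proportional to quality times time invested),
the quality distribution and the starting effort are assumptions filled in for illustration,
keeping the stated ingredients of the talk: a quality Q per academic, 4 in 10 funded, a
2.25x output multiplier for grant holders, the 10-percent thermostat rule, and a
random-allocation baseline for comparison.

```python
import random

# Population size, fraction funded each year, and output multiplier for grant
# holders -- the last two taken from the talk (4 in 10 funded, 225% of solo output).
N, FUND_FRAC, BOOST = 100, 0.4, 2.25

def run(years=50, thermostat=True, seed=1):
    rng = random.Random(seed)
    Q = [rng.uniform(0.1, 1.0) for _ in range(N)]  # researcher quality (assumed uniform)
    f = [0.3] * N                                  # fraction of time spent grant-writing
    n_grants = int(FUND_FRAC * N)
    total = 0.0
    for _ in range(years):
        if thermostat:
            # Assumption: a proposal's rank reflects quality times time invested in it.
            ranked = sorted(range(N), key=lambda i: Q[i] * f[i], reverse=True)
            funded = set(ranked[:n_grants])
        else:
            funded = set(rng.sample(range(N), n_grants))  # random allocation...
            f = [0.0] * N                                 # ...so nobody writes proposals
        for i in range(N):
            # Research done: remaining time, scaled up by the grant if funded.
            total += Q[i] * (1 - f[i]) * (BOOST if i in funded else 1.0)
        if thermostat:
            # Thermostat rule: back off 10% after success, try 10% harder after failure.
            f = [max(0.0, f[i] - 0.1) if i in funded else min(1.0, f[i] + 0.1)
                 for i in range(N)]
    return total / years  # mean yearly output, in arbitrary "units of knowledge"

print(f"thermostat: {run():.1f}  random: {run(thermostat=False):.1f}")
```

Under these assumptions the thermostat population wastes enough time on proposals that it
comes out below random allocation, the same qualitative result reported in the talk.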
So what is going wrong; where is the waste coming from; is our system giving the money
to the right people? We plotted Q along the X axis against the rate at which grants are won.
It seems the right people are getting the grants; it's just that there is a lot of time
wasted on the allocation process. There is a fall-off at about Q = 0.6 or 0.5: people of low
quality never get any grants. Life is up and down in the mid range of Q,
but they have to live with that.
Finally, surely we can improve on random allocation. We looked at centralised
distribution processes. What if we spent 5% of the funding pie on paying people
to do assessment from the centre? A lot depends on research reputation: how well you're
regarded in your field, whether your peers think you're any good. We have fantastic
bibliometric indices these days, like the H-index, and they correlate very well
with who gets the grants. Given that they correlate well, why bother spending all that time
writing grants? Give the money to the top people, as you probably know who they
are already. Doing that, you get something closer to optimal allocation.
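The H-index mentioned here has a simple definition: the largest h such that h of your papers
have at least h citations each. A minimal sketch of the computation:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Walk the citation counts from most- to least-cited; the index h grows
    # as long as the i-th paper has at least i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # prints 4: four papers have >= 4 citations each
```

The measure is easy to compute from public citation databases, which is part of why it is
so attractive to funders, and, as discussed later in the Q&A, part of why it is gameable.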
Or say once a year you grab your academics, drive them to a house in the country, tell them
to write the proposal in 3 hours, and don't let them prepare. At least doing that would
not waste so many other days in the year. We could add more parameters to that model, but
I don't think there would be great changes, and the message would be the same.
I think there is this terrible blind spot. Science has changed socially and institutionally
because of the massive rush of incoming numbers, near-exponential growth, and a
professionalisation of something that was once done by amateurs. It seems odd that we are
loath to apply our own analytic tools, turning the microscope inwards: are we running this
system in a way that makes any sense? If you're an economist you may comment on how
Sub-Saharan Africa might sort itself out, say. But oddly we are loath to look at our own
ways of doing things and put ourselves up for self-criticism.
Non-speaker note to the audience.
For anyone not in the academic world who begrudges paying big bucks to Elsevier
or Springer or whoever for science journal articles: there is a process called Technical
Journal Requests, via your local library, which then goes via the British Library.
Something like 2.50 GBP an article and 10 pence per page copied/printed.
If you go to an ordinary desk clerk at a local library they won't know what you're talking
about; only at the higher levels may they know about it. It's not immediate
after publication; there's something like a 3-month delay.
(general audience unawareness of this)
I've even talked to a Hartley Library senior librarian and he had not heard of it.
It's some sort of contractual obligation: to be able to sell their journals in the UK
the publishers have to abide by this secondary route, to let non-academics into
this academic world.
(Audience member) I can confirm this, as I've done it at my local library branch.
You write out a card, they send off the request and it comes back to you.
The other thing worth doing, if you have been at a British university, is to ask what
arrangements they have for alumni; I found out after 30 years. My own first-degree
university allows online general access: you sign up for it, registering with the university
library website as an alumnus, and you can get journal access. For Exeter anyway; I
don't know about others.
(Another audience member) Southampton university don't do it.
(From an unattributable Hartley source: there was discussion about the possibility
of a day pass for electronic access, physically at the Hartley, for anyone not
necessarily an alumnus or academic.)
Speaker: Don't be shy of emailing the corresponding author; every such author I've
contacted has obliged. Particularly somewhere like South America, where you feel guilty
that they cannot get easy access themselves, a lot of authors will probably send you a pdf.
(Audience member) That's where I've found Google Scholar useful, with its list of
authors' contact details.
There are a lot of reasonable ways of attempting to circumvent the whole madness
of Elsevier etc. charging so much money. Southampton has a system called E-Prints
which is supposed to respect 13-month embargoes and the like. I think many put their
material on there and wait for a cease-and-desist letter from the publisher.
Q: A year from print date before such access? The BL system is faster access than that?
I think most academics ignore that.
Q: Isn't it a bit confused because often title and abstract are available before publication date?
Q: Wellcome are supporting open access, aren't they?
Exactly, it's increasingly moving that way. A bunch of powerful players in the system are
coming around. It seems to be hitting a tipping point at long last.
Q: There seems to be an assumed correlation between competitiveness and being a good
researcher, and to me that's not obvious. And I assume it has an impact on the research
that is carried out: if the competitive people get the money, the research will be going in
a different direction to the others'. Secondly, the ability to write applications might
imply that people with a broader range of abilities will win over those with a narrower
range, and I would have thought the narrower, more specialised research could be more
valuable.
Yes, I think those are both excellent points. You could easily extend that model to a
multi-factor model; that's one of our ambitions, to stretch it out a bit and say there really
is a bunch of different abilities at play here. One might be grant-writing ability, the
rhetorical skill for the "please give me money" argument. That can be quite separate from
how good you are at the core research activity. It may be related to your ability to write
scientific communications, but personally I find the 2 kinds of writing completely
different: I'm much happier writing papers than grant proposals. I think that writing grant
proposals is something akin to writing advertising, and a lot of people who go into
science, well, they didn't go into advertising. They went for knowledge discovery, and they
discover belatedly, certainly in my case, "I'm going to have to do a lot of this, aren't I".
I think it would be interesting to go to people like David Willetts, who have some power
over how these systems work, and say: these are the different factors at play, core research
skills and people's rhetorical grant-writing convincingness; how closely do you think
they're related? At one extreme of the model, assume they are unrelated, uncorrelated:
whether someone is a convincing writer of proposals is quite different from being a good
researcher. Scientists are a self-selecting, odd little population. In the general
population, grant-writing ability and ability to do research may be well correlated, but
that does not hold across all humanity. In the odd corner of humanity that ends up at
university, that relationship may have broken down, because you've already selected people
through a lot of filters. They have put up with the university environment long enough to
want to stay there, so the correlation may not hold. I would start with the assumption that
they were unconnected
and then ask: look, here's a system where clearly what you want to do is somehow tease out
a link. What the system purports to do is give the money to the right people: it's trying to
find quality of research skill and hand the money over to those people. It's not
unreasonable to build a model that says: what if they're separate, or only partially
correlated; how bad is it? And indeed, once a poor link can be demonstrated, apparently
radical proposals, like random allocation, look a hell of a lot more reasonable than
they might at first sight. Which, mad as it sounds, I would not rule out.
(Audience member) I think they may be negatively correlated, in that anyone who holds such
a position is either really good or just good at selling himself.
Q: I've heard there has been a statistical study on the real impacts of research. It strikes
me that the older a researcher gets, the more likely he is to get a grant proposal accepted,
from repeated trying, or to be highly rated. But important discoveries are actually made by
researchers in their formative years, and those are the people unlikely to get grants
because they haven't got the kudos associated with getting a grant. Has anyone studied
whether an Einstein or a Bill Hamilton would get a grant?
Yes. We dared to do a literature review, and one thing we found again and again was that the
most candid discussion of the problem came from people right at the close of their careers,
when finally they felt they could speak freely about how crazy the system was. You often get
emeritus professors saying "this is madness", but they didn't say that when they were 35.
The data is there; we have the full computer version of the citation network of modern
science. The problem with research significance is that you don't know what it may lead to.
But with the benefit of hindsight we can look back and say: here are what turned out to be
10 breakthrough discoveries in, say, biotechnology; at the time they did not look like much;
who tended to have them; would they have been funded under that system? It's a great example
of where we can use the growing network of bibliometric tools to go back and tease it out,
but I don't know if anyone has done that so far. With a colleague in Germany I was trying to
get funding for a study into the unfairness of funding, and we didn't get the funding.
But we had a second bite at the apple with an institute for interdisciplinary studies in
Bielefeld? in Germany, who had this lovely idea of inviting a bunch of scholars to live in a
forest for a couple of weeks, in houses, to get them out of their normal environment,
putting 100% of their time into collaborative work, inviting bibliometrics and
philosophy-of-science people. There must be room for good work going back over the
publication record. Knowing which research to fund is much like asking which horse I should
back in the Grand National; if you knew which to back, then the problem would be a very
different one. It's a problem of approximate decision-making. My strong suspicion is that we
rely on a bunch of heuristics that are actually a bit rubbish. We rely on the status
heuristic that you referred to: we give the money to the most famous prof in the contest
whereas, as you suggest, he is unlikely to come up with anything new.
How does that work? If someone well known puts their name to a proposal, doesn't the money
go to the people in their team?
A personal anecdote, supported by others' anecdotes too: if you want to succeed at grant
funding in the current tough climate, it's crazy to have a middle-ranking or junior person
put forward as the leader. Effectively you end up going around recruiting famous people,
telling them "I've basically written this, just put your name on it", and in my experience
it's an ugly strategy, but one that works. A lot of what is described in that model happens
with no collaborative team-building at all, but in the real world there is this frantic
round of everyone trying to get into everyone else's pocket for funding applications.
It is particularly a farce to say that academics are setting the research agenda when
really, as the money becomes so important to people's careers, the agenda for what is to be
studied is set by the funding agencies, directed by the governments that stand behind them.
I nominally work in a group that studies something called complexity science, building
computer models of things. Areas like ecosystem services, disaster mitigation,
counter-terrorism, ideas like this, are so much in the news, at the forefront of
politicians' minds, that they filter down to the research councils. So, say, "we need ways
of combating Britain's growing flooding problem, combining traditional flooding knowledge
with computer modelling: you guys write something on that". All too readily we write
something on that, maybe with dubious intellectual content. When the carrot is there, people
chase the carrot. You get this odd business of people forming and reforming teams in the
hope of cashing in on each other's buzzwords, as well as status. None of that helps
research.
(Audience member) We have a friend studying superstrings; all the money was going to that.
He particularly studies something called mirror matter. He feels there is an experiment
which could be done to establish whether there is or is not such a thing as mirror matter,
but he can't get the money for it, because it's not fashionable; money goes wherever the
current flavour is?
I'm hesitant to say this as it sounds like sour grapes: I've had my share of the funding
pile and am very grateful for it. But that does not mean I think it's a good system.
You're right, there are frightening parallels between science and the pop charts. Very much
a flavour of the month, what is in and what's not.
Does it relate to Kuhn's work on the sociology of science, building in the cultural
elements and all the heuristics that go with decision-making, as part of the different
councils? What is the likelihood of any paradigm shifts within that cultural context? You
mentioned the stuff about complexity theory, adapting to climate change; the stuff being
offered had been done a thousand times before. We've got a problem, we still haven't sorted
it, let's produce yet another model with one more decimal place and we will get there?
It's a hard problem. Even if you tried pulling all the human failings out of the system,
and you had this perfect, ideal set-up where everything was run as rationally as possible
and we did our best to fight all of these biases, there still remains the question of which
horse to back, the explore/exploit problem. We have this body of knowledge; deciding which
seemingly promising upstart paradigm is going to lead to significant discovery, and which
is going to be confined to the dustbin of history, is really difficult. I find it hard to
insist that our current policies for doing that are optimal.
If you've produced an agent-based model of a culture based on the principle of normal
science, with a strong competitive side, where you build in the politics, then what becomes
the likelihood of getting any sort of paradigm shift? By the time of Kuhn's theoretical
work in the 1960s we were past the 1940s and the massive expansion. Was there a difference
in the kind of funding allocation at that time?
It's the same as economics: a very strange state of constant growth became seen as normal,
whereas in population biology and in economics growth is temporary and eventually you hit
some kind of ceiling. The ceiling was hit in 2008, on that graph of resources.
You mentioned the pop charts as a parallel. There was a problem with the pop charts,
Payola (record companies knowing which record stores were monitored and buying their own
product from those stores); isn't there an equivalent in citation manipulation? In a sense
buying citations by deliberately loading articles with citations. You're not necessarily
getting money but getting kudos or something: a phoney structure to a certain extent, but
it comes into the economics.
Certainly there are citation rings, where people cite the crap out of each other's articles
in order to bolster each other's CVs. And journal editors unashamedly ask you to cite more
things from their journals to bolster their figures; that's true.
Earlier you thought bibliometrics was possibly a better solution to the problem, but it has
already caused these distortions; make that what funding relies upon and that problem gets
worse?
Sure, it's a tricky one. There was a nice article in the Guardian with a long series of
angry letters coming after it. It got to the heart of the problem, and many of those people
rightly said this is crazy, you can't sum a mind up in a single number; many of the great
geniuses of history would have scored very poorly on this measure. After this long series
of nasty, aggressive answers, someone came along saying it was much better in the old days,
when it was just about petty politics and favouritism. There is this sense that we have to
have something. Something like the H-index is hugely gameable, with lots of flaws, but if
the alternative is a sort of smug tenured professor giving the job to his pal, then I prefer
the H-index. We have to make these decisions: hiring decisions, publication indexes. Any
article, however awful, will eventually get published, moving down the ladder, so we may as
well accept that. I'm not sure that journals are keeping the madness out of the stream; the
madness eventually gets into the stream. But the question is, is anyone going to cite it?
Some of them may not get published at all?
They all influence one another. The other problem is the attentional problem.
People only have so many hours in the day for reading. Even I rely on my students to
tell me if there has been anything interesting recently. That is not helped by massive
pressure to publish. The nuggets of knowledge are smeared out as thinly as possible.
You can imagine a world where scientists are only allowed to publish one paper a
decade - wow, it would be a good paper and well worth reading. I know that system wouldn't
work either. The idea of modelling is to try and see if there is anything practical we
could do. Maybe you have to be resigned to stirring up the system every 10 years.
You look at the Research Assessment Exercise and now the Research Excellence
Framework: people get used to how to maximise their scores on those things
and start changing their activities to fit in with the measuring stick. A very human
trait, but... I'm sure we could do better than some of the measuring sticks that we
currently use. Perhaps a balanced approach to bibliometrics, hopefully.
Coming from politics, there are plans to stop London being the libel capital of the
world. Coming into the area of science - such as Simon Singh and knocking copy /
criticism of certain individuals concerning what they consider science and he doesn't
- that all went up to the libel courts. Do you know if that is going to come to a halt as
a subset of these proposals?
If you're going to have a healthy publication environment then it should be possible.
But when does criticism become libel?
There is plagiarism as well?
I've had friends who have discovered parts of their theses in articles in other obscure
journals in the world, and sent angry letters to the editor - with replies such as, gosh, don't know
how that slipped through the net. If you have a system of thousands of individuals
under pressure to produce these things then some will bend or throw away the rules.
What about biotech companies - have they got patents to protect? Keeping knowledge to themselves?
Indeed, on the commercial side. Research itself is a terrible tragedy-of-the-commons issue.
Everyone benefits from scientific discoveries being out in the open, but you'd rather
not be the guy who paid for it. Governments are sometimes like that. Developing
countries don't put much into their science budgets because they consider they have
other problems. Perhaps there is a responsibility on developed countries to stay serious on the
science. The corporate side of research is intriguing because internally to a company the profit
motive is very clear, binding individuals in that company curiously into a co-operative enterprise to
lead to a saleable drug, a new computer that works or whatever. From the wider public POV the company
defends its intellectual property for a better return on investment. They do fund stuff at
universities. We have a lot of relationships with people like IBM, BT and HP where a student
might be half-funded by EPSRC? and half by the company, do a residence with the company, and
it's useful for students to make those contacts. In the good times companies are pretty
unconcerned, because it's small change to them and they don't take a close interest in the student,
so he might accidentally do something interesting. If the companies do pay close attention they tend to
make sure the student is doing something that is not very bold or exciting - anything bold they
would not let a student do, as they would want to keep it in-house. A company's job is to keep
its shareholders happy. A notion of appealing to their better nature seems silly.
Places like HP, where friends of mine have worked, dipped in and out of industry: when they're
on a roll, when times are good, it sounds a great place to work, but the labs are the first to be affected when
there is belt tightening. I see science as society's research lab. For the likes of HP it makes
sense, when they have some spare money, to invest it in new discoveries, or their company
goes down the tube in the long run. In the wider world we cannot rely on companies to share
their accumulated knowledge; sometimes it leaks out eventually.
What you've described is the interplay of science and ideology, much like the ??? economy. Do
you see trends in the data that represent the phenomenological? dimension of the current ideology, which
is very individualistic? So do you see over the years a decrease in the number of co-authors,
i.e. less collaboration between people? Are there trends you can relate to cultures, as I remember
people in Denmark used to produce far more papers with a lot of people on them.
You see differences between countries. It would be easy to look, but I haven't looked
at the historical trends in something like that. There are massive differences across
different disciplines, and countries?
Different countries and different disciplines: in maths single-author papers are
very common, in medicine 15 authors is the norm.
I was reading something last week, something I was not aware of, where academics have
come a cropper in this concept of data mining. A university has got full permission
to access whatever they want on Elsevier etc., but if they try and find a sequence of GCATs
in gene sequences in all the journals over all time, it suddenly gets sat on. What is going on there,
with very obscure search terms, when they've paid their multi-tens of thousands of pounds in fees for access?
Bioinformatics is an interesting one - some of their internal tools look for strings
in gene sequences, and they were amongst the first to start automating the literature search
problem. The problem of drinking from the fire-hose that everybody faces is too
many papers appearing. Particularly in a hot area like bioinformatics it's impossible to
keep up, so they were the first to create automated watchdog tools: I'm interested in this protein, anyone
who mentions it, I want to know. It could be cock-up rather than conspiracy.
It could be as simple as avoiding denial of service - the overloading of servers. Many people would
happily host mirrors of their data, if they would let them.
If a university had a big enough computer they could download the whole lot over time, and
the publishers could not do anything about it.
I've checked some of these genome sequences and it says 100 percent complete, but when
you delve into it, it's not complete - 5 percent missing?
It's hard to say with a genome if it's complete or not, as there is so much we don't understand - which
bits are interesting. It may be complete for that test subject, but complete for all we cannot be sure.
You talked about game theory. Does the same apply to scientific research - the more collaboration
you have, the better the outcome? Is there a statistical relationship there? You could test it
with the Web of Knowledge statistics - number of citations versus the number of authors,
depending on how you measure quality - or is centralised competition random?
So, whether collaborative papers are more popular - it's so fundamental. There seems to be a blind assumption
that the effectiveness of individualistic / competitive thinking applies to a lot of aspects
of society, so scientists have no special right to object to it being thrown at them. It's not unreasonable
to ask of other human activities whether competition produces the stated benefits. It seems to
me that science, at some level, is run as a co-operative exercise. If science was run as a lot
of narrow fiefdoms, protecting data and discoveries, not sharing on a global level, we would not
be halfway to where we are now. So there is the fundamental argument that it is like the prisoner's
dilemma - as in the repeated prisoner's dilemma, ensuring cooperative outcomes to the
benefit of all involved. It's also subject to economic and bureaucratic mindsets from many university
administrators and research councils. Accountability is a good thing - it is perfectly reasonable
and proper for David Willetts to say, if we are giving you all this money then you must show
that you are the right person to receive it, and what did you do with it. But leaping from that
to "the best way for me to hand this money out is through an aggressive, time-consuming
round of competition" is a whole other argument.
This is what you were saying about the explosion of the number of scientists in the last
60 years and the over-proliferation of PhDs , is the answer to the question of competition
a matter of introducing a rigorous system of culling?
I strongly suspect so. It's just unethical, some of the things we do at the moment. The
post-doc merry-go-round is a very unpleasant place to be. People in their late 20s and
early 30s, starting families on very insecure 2-year contracts or less, often having to make
an international move - it seems it's the only way to advance their career, and their life is incredibly
tenuous. It doesn't even lead to good research output. The logical conclusion is that academia
is horribly bloated and I should be out of the door along with 80 percent of my colleagues.
If society wants to invest X amount of money in its research lab, academia, then perhaps the
most efficient way to spend that money is in a much more selective and exclusive fashion.
I think you may be right. The way to have this sort of argument is through such models
and testing out different mechanisms. We get some social experimentation through looking
at other countries' research solutions. We have had a period of explosive growth; maybe it's
simply been poisonous in its effects, and perhaps we need to be cruel to be kind. From a UK university's
POV, as a fully funded overseas student, like myself, I just needed to have a pulse - I thought that was
a tough bar to get over. If universities see students as income then they will be mercenary.
That may not be good for the system as a whole. Maybe some of the PhDs I am teaching
would be better off driving an ice-cream van, or as a surgeon or lawyer. You cannot have a society
consisting of all scientists; there must be some limits.
If you have 5 PhD students rather than the one that was the norm in the past, is that purely because it's
seen as a revenue stream, or are there other factors?
I think the norm for mid-career people like me is to accept the brutal truth that we don't
get to do research any more and have to surround ourselves with a small army of PhDs or
post-docs who do the actual work, and be resigned to being middle managers.
You look around you and see that the most successful people have built armies of workers
around them, so there is that. Some of it is that you get funding from the research
councils for training as well as research, and that changes PhD students into income in
a way. When you stand outside the system and think, are we doing this the right way,
you have to be alive to the possibility that we are not. I have colleagues in very abstract
areas of, say, computer science or informatics who tend not to have an excess of PhD
students because they find it hard to find good or appropriate ones. In the buzz-word rich
areas of academia like complexity science ... (change of disc)
Q: Concerning the likes of Solent University, which has got very few PhDs, so their remit is to
teach - is that something that is going to happen more often? The other thing is the number of
overseas students and funding?
The future looks a bit grim. You look at the frantic squabbling and positioning Southampton
does so it can be seen as a proper Russell Group university rather than some upstart
redbrick, and you realise there is a curve there. The few that are doing OK are seen as
research-active universities and get rewarded for that. If you fall off the cliff a bit they
take your funding away. The logical future extension of that would be a return to
two separate kinds of institution - polytechnics.
The government could soon take the money away, so the likes of Solent would have to
get their money from undergrads - there would be no time for research?
That is the life of a lot of people now. It's pretty brutal on someone who ended up there
through no fault of their own. You have to be superhuman to do all the teaching and then
do some research at the weekends. Maybe it is right we have two separate kinds of ...
Would making exams harder reduce the numbers going to universities, and some
unis would close down - is that what government is looking to? A politician is all
about selling, pitching ideas to people, not whether it is right? A student must think to himself that
this degree must get me a decent job and pay off these fees?
If we charge them 9000 GBP a year then fewer of them will be going for "I want to find
out about the world". We get a lot of pragmatic, job-oriented undergrads, though not all
of them. There are a number of bright, pure-knowledge people as well, which is amazing
really. You would hope that a country as rich as ours could allow a couple of
years for people to get a perspective on the world in general, rather than a narrow
focus on technical skills.
Other societies do not look at education as a way of getting a job. Education as a way
for people to conceptualise, which is a different thing? The British system is set upon ...
It seems one of these circular traps where nobody wants to invest the time and extra
resources into developing well-informed citizens. In the long run - imagine a bunch of cavemen
in a cave wondering if they should invest some meat in fire research; that was a good
move. With a stripped-down view of education it would not have got us where we are now.
Monday 18 Jun, Prof Jon Adams: Current scientific aspects within marine archaeology
Talk 1 hour, Q&A 1/2 hour, 15 people, coincided with football.
Science has a lot of techniques and methodologies that we can use for archaeological
purposes. So I am very much a consumer of science. I use it in the sense of running
processes and using instrumentation, not a person building it. In terms of acoustics I don't
understand the signal processing techniques . I want to give you some insights into what
maritime archaeology (MA) does at the moment as a way of answering questions.
Science can answer questions that we could not even ask just 10 years ago.
Things are moving fast in the areas of computing, acoustics, electronics , imaging
and all combining to provide a suite of tools that are becoming ever more sophisticated
and cheaper. Archaeology is not a privileged discipline - a million-pound grant
is unimaginably huge to us, but to an engineering discipline that is chicken feed.
We have an interesting relationship with other departments. MA is one of the areas
that SU considers one of its primary focuses. We have a lot of strong links with
the oceanography dept and engineering, in particular ship science.
When did MA become MA, as opposed to anything else such as well-intentioned wreck
hunting, adventuring or treasure hunting? After the war, with the reinvention of
SCUBA gear by Cousteau - it had been around since the early 19C. His engineer colleague
Emile Gagnan miniaturised SCUBA using 2 technologies, one being high-pressure
gas cylinders, giving enough quantity for more than 5 minutes of use. Diving became a leisure pursuit in the Med, and hundreds
of shipwreck sites were discovered - many trashed out of ignorance, or because the artifacts were
worth money, so raised and auctioned. Archaeology was tried on some of these early
wreck discoveries. Cousteau tried one himself. He excavated a 1C BC Roman wreck at
Grand Congloué off the south of France. He set a precedent in that he demonstrated
the new SCUBA gear was eminently suitable for doing science, not just MA, under water.
But his archaeology was crap. He kept no records, his divers raised things willy-nilly,
and he did not realise he was excavating 2 wrecks, not one.
So the MA of the 1950s had some way to go. It changed in SW Turkey in 1960, when
Peter Throckmorton - historian, correspondent, anthropologist etc. and a visionary -
foresaw MA before anyone else. He spent a lot of time with sponge divers, in the
water for hours at a time with standard copper diving helmets. They were finding all sorts
of stuff other than sponges, stuff they weren't interested in. But they talked to Peter in
Isthmea? etc. about it. He started to catalogue the things they were describing to him.
He then went out diving with them, and he created the first maritime sites and monuments
record. We call it historic environment records these days. A list of stuff that could be
important for various cultural reasons. One of the sites made him contact a Pennsylvania
post grad called George Bass, a specialist in Med Bronze Age trade, who was looking for work
to advance his PhD research. They both dived on a wreck in 30m of water off Cape Gelidonya,
and it turned out to be a ship that sank about 1200 BC. What marked this out? Think
about the codes of practice that we professionals work to today, e.g. the Code of Practice
of the Institute of Archaeologists, which defines the process from early research stage, through
data collection, to post-excavation analysis, publication, display etc. It's an ethical
trajectory as much as a methodological one. This is the first time we see anyone put this
together in underwater excavation. Bass comes to the Med with research questions and,
rather than saying "it's 30m under water, how can I be expected to do
proper arch.?", he comes with the attitude: if I'm only able to spend 30 minutes under water,
that 30 min needs to be as controlled as if it was 30 min on a land site. So he refused to make any
concessions to it being under water. That was the conceptual breakthrough.
He realised something else, which the oil companies were to do a bit later than he did. The oil
companies in the 70s in the North Sea used a lot of technicians / craftsmen to do various
diving jobs. They realised it was stupid to teach a diver how to be a coded welder or an NDT
technician, because that takes years to learn. Much better to take a welder and teach them
how to dive, which you could do in 12 weeks. This is what Bass did: chuck out the old model of
real divers with a token arch. sitting on the deck. Change it all around and put the arch.
underwater, as they could be taught to dive in a couple of weeks in a clear-water environment.
That Turkish excavation would near enough meet the current IoA code of ethics or the UNESCO
2001 convention on the protection of underwater cultural heritage.
So this is where MA starts as a coherently focused and academically sound part of the wider
discipline of arch., half a century ago.
Bass's attitude to doing MA also extended to trying to use whatever scientific procedure
he could get into the water. A lot of things don't work under water - electronics
and salt water traditionally don't go together well. He was keen on using submersibles and
photogrammetric techniques, and tried many things for the first time that others later
took and developed for other sites thereafter.
I will now come over to the murkier waters of the Solent. Not long after Bass's Turkey work, in 1963,
Alexander McKee, who lived on Hayling Island, conceived a project he called Project Solent Ships.
He'd learnt to dive in the forces during the war, and became interested in naval history.
The Solent is one of the most treacherous pieces of water in the world. He starts drawing up
a list of potential sites of interest - things like the Royal George that sank at Spithead,
the Edgar, the Boyne, the Invincible etc. He realised the most important was a ship called
the Mary Rose. He described it as the most important shipwreck in the whole of NW Europe.
He starts looking for it in 1963, with archival research and looking at sea charts. For several years
he does not make much headway. Then he comes across an engraving of a now-lost painting,
made in 1777 by the then brand-new Society of Antiquaries in London.
The painting was lost in a fire at Cowdray House, but this accurate rendition
engraving shows Henry VIII sat on his horse at Southsea Castle, the IOW in the background,
the English fleet on the right and the French fleet off St Helens on the IOW, coming
to engage the English, and the Mary Rose has just heeled over and sunk. The French say, of course,
that they sank it by cannon fire, but they didn't. It was sunk - think health and safety -
in an "incident pit": an accident that happens because 6 things occur at once,
any one of which would not have mattered on its own. So that position is where he started
to look. He cleverly put together the former account of 2 brothers, John and Charles Anthony
Deane, who were based in Portsmouth - some of the first maritime antiquarians. They
were salvagers and had in fact invented the copper helmet for diving. They sold the patent
to Augustus Siebe, the man who developed the copper helmet system over the world.
The brothers used it to salvage wrecks around the British coastline. Although they started
for entirely hard-headed commercial reasons, they start seeing wrecks that are old by the
standards of their time. They start getting interested in the antiquities under the water
rather than the stuff that would make them money. The Deanes start collecting antiquities
and have them professionally illustrated - scale drawings, aquatinted, in Portsmouth
City Museum today. The Deanes were called over by the navy to blast away at the
Royal George as a hazard to navigation. Fishermen asked them to come over to a place where
they were continually snagging their nets. They started seeing timbers and guns, and in 1836
had discovered the site of the Mary Rose. They worked on it for a few years and then it all
goes quiet and gets forgotten. McKee tracked down the records of the Deanes and spent weekends
looking for the MR. By 1967 they still had not found the wreck.
Then Harold Edgerton from MIT was in England to demonstrate his new marine geophysical
survey systems, using sound. He had 2 systems: side-scan sonar, using high-frequency sound that
bounces off the seabed, where the return gives you the character of the seabed - a map of the seabed,
something like photography and mapping on land. But he also had sub-bottom sonar, using
a lower frequency which continues to penetrate, not all bouncing off the seabed. Some of the
sound will reflect off the interface between every subsequent material the sound passes
through. Edgerton came over to demonstrate these systems to potential buyers.
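Both systems boil down to timing an acoustic echo: the range to a reflector is half the two-way travel time multiplied by the speed of sound in water. A minimal sketch (the ~1500 m/s figure is a typical seawater value, not Edgerton's calibration):

```python
SOUND_SPEED = 1500.0  # m/s, typical for seawater; varies with temp, salinity, depth

def echo_range(two_way_time_s, speed=SOUND_SPEED):
    """Distance to a reflector from the round-trip time of an acoustic pulse."""
    return speed * two_way_time_s / 2.0

# A pulse returning after 40 ms corresponds to a reflector ~30 m away.
print(echo_range(0.040))  # 30.0
```
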
McKee contacted him and asked him to do the demo in the Solent over the interesting
area. He did, and found the MR. Any geophysicist at the time would have gone weak
at the knees seeing this plot, because here is a sunken shipwreck. Some of the sound is hitting
subsea geology - essentially a time slice of the sound travelling through the seabed.
But something is interrupting that passage of the sound, something making the sound
behave differently. There is a slight undulation of the seabed which is atypical.
Put those 2 things together and you have a shipwreck. Printed on thermal paper, with
someone with a stopwatch marking a time mark every minute as it rolled out
of the printer. At the same time people would be using Decca, or transits with sextants,
taking fixes so you can return to the same point. So the wreck was identified
but wasn't seen for another 2 years. In 1971 a McKee diver saw the wreck for the first time.
For the next 11 years a project was run to eventually excavate the whole of the MR.
What McKee's team had first seen was the very tops of the timbers. The MR had sunk
and rolled over onto its starboard side on an estuarine seabed of very soft mud, and the several
hundred tons of the MR had impacted several metres into the seabed.
Then environmental forces gradually erode everything that is proud of the seabed.
What is below fills with sediment and is preserved. Water and sludge is the perfect
preserving medium for organic material - shipwrecks like the MR, and much of their contents.
This is where I cut my teeth in MA terms, having been trained as a land arch. and then
transposing what I knew and had been taught to working underwater. My mentor
was Margaret Rule, the arch. director, and she had the same attitude as Bass: for
any instrument, however much a prototype or unproven it was, she would contact
people at whatever university or company. So we had Marconi and several universities
with experimental kit to try out on the MR. If it worked there then it would probably
work anywhere. If it worked for us then it made our arch. more effective, quicker and
perhaps safer. We tried all sorts of acoustic systems and imaging - things that we now think of as
new techniques, sector-scanning sonar, digital photogrammetry, were all tried on the MR.
Some things worked, some didn't; the ethos was let's grab anything and try putting it to the
service of arch., to make sea arch. catch up with land arch. We felt it was very much in its
infancy in the 70s, and still seen by some land arch. as a bit dodgy, close to the lunatic fringe
or something. Trying to match the control that we would expect on a land site.
So sound found the sites, and then we used it for measuring the structure under water.
We use light to survey in air - our total stations used infra-red, now lasers, with
reflectors, converting to distance. Laser light degrades too quickly underwater; we can use
green laser up to about 25m or so. So we use sound, and if we calibrate the effects that turbidity,
salinity and temperature have on the sound, we can use sound accurately.
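The calibration mentioned here is usually done with an empirical sound-speed formula. One widely used version is Medwin's (1975), valid for roughly 0-35 °C, salinity 0-45 ppt and depths to about 1000 m; the coefficients below are as I recall them, so treat them as an assumption and check a reference before relying on them:

```python
def sound_speed(T, S, z):
    """Medwin's empirical formula for the speed of sound in seawater (m/s).
    T: temperature (deg C), S: salinity (ppt), z: depth (m).
    Coefficients quoted from memory - verify before serious use."""
    return (1449.2 + 4.6*T - 0.055*T**2 + 0.00029*T**3
            + (1.34 - 0.010*T)*(S - 35.0) + 0.016*z)

# Shallow, Solent-like conditions: ~10 degC, salinity ~34 ppt, 10 m depth
print(round(sound_speed(10.0, 34.0, 10.0), 1))  # ~1488.9 m/s
```
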
We set up transponders around the site, then with an interrogation unit placed on an object,
say, measure the distance to 3 transponders. When the MR was salvaged in 1982, it was lifted with
a lifting frame, attached by wires, then fitted to a premade cradle constructed to the shape
of the ship. The idea was that if we put our acoustic transponders on the alignment holders -
stabbing guides on each corner - then the underwater transfer would be regulated via a
video screen. Unfortunately it didn't quite work out like that. One of the legs was not raised
high enough in the underwater transfer; it caught in the seabed and bent, so 3 straight legs
and 1 bent leg. So we were not getting 4 legs into the stabbing guides, requiring humans to
go back in the water. We had to cut the offending leg off. This is why the lifting frame
was at a jaunty angle relative to the cradle. Again, like the sinking, it was an incident pit of several
things happening at once. The strop on the main hook to the crane was around the
base of the cradle, past the stabbing guide on the wrong side, so it crushed the
stabbing guide, effectively shortening the strop, on the side with now only one leg.
The pin securing that leg could not be emplaced, so the engineers put in a 2-inch pin
instead of a 4-inch pin. See how the accident is more and more likely to happen. The pin
sheared, and that is why the cradle ended up like that. I was the person sent in by Margaret
to see what damage had been done, and frame S8 on the starboard quarter was slightly graunched on top,
which was the only place where the lifting frame had contacted the wreck as it collapsed.
But S8 had punctured a hole in the steel tubing, so Tudor technology proved slightly tougher.
The plans from the early days were trying to answer the question: how big is the MR, what condition is it
in, before opening up the area - something like 50m x 25m. At the time we were taking local
measurements and writing them on a board, then with rulers and compass drawing them up,
striking off arcs, building from the known to the unknown, a map of the timbers. Over a site
that big, compound error creeps in quite quickly and it is almost impossible to determine where the
erroneous measurements are. With acoustics we were able to do the whole thing in one go,
essentially to get a best fit. So while tape measures were giving better local accuracy,
John Partridge of Sonardyne gave something like 10cm accuracy over the whole site - reasonable
for the time, 1975. I became interested in the ship as an artifact, as a piece of technology, less interested in
the objects carried in the MR. It's big and complex, and for a surveyor it's awkward as there
is not a single flat surface, right angle or straight line in the whole structure. How do you measure such a thing?
After the MR I went to record the Sea Venture in Bermuda. That was easier, as essentially
what was left was flat on the bottom of a coral gully in 9m of nice clear warm water.
So you can use a range of quite basic techniques, as it's largely flat and can be treated as a 2D problem and mosaicked
together. Surveyors measure with angles, so the theodolites of old still now measure angles, but now
digitally; they now also measure distance with lasers. So: measurement of an unknown object
relative to a baseline, and trigonometry. On land you can do that to an accuracy of about
10 seconds of arc. In the survey of India in the 19C we find that going from one side of the
continent to the other they built up only 4 feet of compound error. But under water, angles
are not so accurate. An underwater theodolite is accurate to only about +/- 1/4 degree.
So in this room, over the diagonal, that is about 3 or 4 feet of movement before I could mark
1/4 degree on the protractor. Far too much slop for even a basic pre-disturbance plan,
let alone structural points that you want to fix with more accuracy.
So under water we tend to stick with distances. We use sound for measuring distance, but we still use tape
measures, as for many things they are still quicker. A tape-measure survey can be surprisingly accurate
if the overall structure is no more than 5m, say. So for a building of this size we could expect
an average error residual of maybe a couple of mm. We can demonstrate that using computer
software. Although tapes are old technology, what we do with the measurements is a
bit more clever. On the MR we started having to deal with the third dimension. The lowest part
of the hull was 8m below the seabed. Excavating down season by season, metre by metre,
the trenches getting deeper - and the average visibility in the Solent is maybe 2 or 3m
in the summer, 10cm in spring and autumn, and sometimes nothing. So we quickly
got to the stage that the trenches were deeper than the visible range.
So, an example: taking a datum point from the top of a beam, done twice, and using a
straight ruler to get the depth below the datum. All very well if you can see what you're
doing, but go deeper and you're into midwater gymnastics, trying to keep things level
and plumb over an object you cannot see. Just as we were finding the arch. was getting more interesting
and important, getting to undisturbed material well within the wreck, our survey
was becoming increasingly unreliable, hard to do and slower.
You can put 2D techniques on their side via offsets. A hull profile can be determined like that
and put in AutoCAD and plotted out, etc. You can build up 3D info on structures
by using a system of relative depths; you can combine distances, angles, offsets and
relative depths into 1 matrix and end up with an accurate survey. On the MR most of
these techniques were hard to do, because ideally you stick a tape on a datum point
and go straight to the point to survey. How do you do the maths? Other people have
developed equipment to tackle this problem. A hydrolite, a sort of mechanical version of
a theodolite, was developed by the Swedish road engineer Elize Lunden?, because Swedish roads
often go between islands, requiring bridge pilings in the water. He needed equipment with which
he could reliably position material in XYZ on the bottom. He invented a taut-wire system:
a horizontal protractor, and a calibrated staff that will read off as the wire goes out, so 3 values
that will give your 3D position - horizontal angle, vertical angle and distance out.
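Those three readings convert to an XYZ position with basic trigonometry. A hypothetical sketch, taking the vertical angle as measured downwards from horizontal (an assumption - the instrument's actual convention isn't described here):

```python
import math

def readings_to_xyz(horiz_deg, vert_deg, dist):
    """Convert (horizontal angle, vertical angle, wire distance) to XYZ
    relative to the instrument. Vertical angle measured downwards (assumed)."""
    h = math.radians(horiz_deg)
    v = math.radians(vert_deg)
    x = dist * math.cos(v) * math.cos(h)
    y = dist * math.cos(v) * math.sin(h)
    z = -dist * math.sin(v)  # negative: point lies below the instrument
    return (x, y, z)

# 10 m of wire, 30 degrees below horizontal, on a bearing of 90 degrees:
x, y, z = readings_to_xyz(90.0, 30.0, 10.0)
```
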
There is no quantifiable error, but it works. The Norwegians came up with another method which
used vector maths: put in an array of datums, make sure they're absolutely on the
same plane (bubble levels etc.), then take slant ranges to the point we need to survey,
and then do the vector maths. Each of those measurements is the radius of a sphere whose centre
is the datum point; the 3 spheres intersect at 2 points and you decide
which one to use. 3 values give the 3D fix but say nothing about the quality of the survey:
one or more could be way out and you would still get a fix in space.
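The sphere-intersection step can be sketched as follows. This is a generic trilateration routine in the spirit of the Norwegian vector-maths method as described, not the actual survey code; it returns both candidate points and leaves the choice to the surveyor:

```python
import math
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the two candidate 3D points at distances r1, r2, r3 from
    datums p1, p2, p3 (each an (x, y, z) tuple). Works by moving into a
    local frame where p1 is the origin, p2 lies on the x-axis and p3 in
    the x-y plane, then solving the three sphere equations."""
    P1, P2, P3 = (np.array(p, dtype=float) for p in (p1, p2, p3))
    ex = (P2 - P1) / np.linalg.norm(P2 - P1)   # local x-axis
    i = ex.dot(P3 - P1)
    ey = P3 - P1 - i * ex
    ey /= np.linalg.norm(ey)                   # local y-axis
    ez = np.cross(ex, ey)                      # local z-axis
    d = np.linalg.norm(P2 - P1)
    j = ey.dot(P3 - P1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z2 = r1**2 - x**2 - y**2
    if z2 < 0:
        raise ValueError("spheres do not intersect (bad measurements)")
    z = math.sqrt(z2)
    base = P1 + x * ex + y * ey
    return base + z * ez, base - z * ez        # the two mirror solutions
```

With coplanar datums the two answers are mirror images above and below the datum plane, which is exactly the "decide which one you will use" step in the talk.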
So Nick Rule, without knowing of the Norwegian effort, came up, after talking to
his Cambridge maths professors, with the idea of introducing a fourth measurement. Taking the
measurements 3 at a time then gives 4 answers: 3 extra answers clustering in 3D space
somewhere around the true position. The magnitude of that cluster gives an idea of the
quality of the measurements. Put it all in a computer program. With datums all over the place
we could then take relatively short distance measurements each time. It speeded things up,
made it easier, and the accuracy went up by a factor of 5.
These days we use a least-squares algorithm to quantify the quality, or something like
nearest-neighbour analysis from statistics.
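The fourth-measurement idea generalises naturally to a least-squares fit over any number of tapes. A sketch, assuming a simple Gauss-Newton iteration (which may well differ from the algorithm actually used in the survey software); the residuals play the role of the "cluster size" quality measure:

```python
import numpy as np

def multilaterate(datums, ranges, iters=50):
    """Best-fit 3D position from 4 or more tape measurements to known
    datum points. Returns the fitted point and the per-tape residuals
    (fitted distance minus measured distance): large residuals flag
    bad measurements that three tapes alone could never reveal."""
    D = np.asarray(datums, dtype=float)
    r = np.asarray(ranges, dtype=float)
    # Initial guess: centroid of the datums, nudged below their plane
    # (on an underwater site the surveyed point lies below the datums).
    p = D.mean(axis=0) + np.array([0.0, 0.0, -1.0])
    for _ in range(iters):
        diff = p - D
        dist = np.linalg.norm(diff, axis=1)
        res = dist - r                  # how far each sphere misses
        J = diff / dist[:, None]        # Jacobian of distances wrt p
        step, *_ = np.linalg.lstsq(J, -res, rcond=None)
        p = p + step
        if np.linalg.norm(step) < 1e-12:
            break
    return p, np.linalg.norm(p - D, axis=1) - r
```

With exact measurements the residuals collapse to zero; perturb one tape length and its residual stands out, which is the quantified quality the talk describes.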
This new software I will be using in Bermuda in a
couple of weeks' time, on one side of an English wreck that sank in 1619, one of the earliest
colonial wrecks, lost on its way to Jamestown. It is unusual for tropical waters to have this much
of the side structure preserved: it was blown against a cliff in a hurricane and tipped over
on its side, and so that side was preserved. We have enough data there to reconstruct the hull
if we survey it properly. So why would we want to survey to 2mm accuracy?
In arch. terms, often we don't want to; it's not accuracy for accuracy's sake, we want it to
tell us something that we otherwise cannot get: to reconstruct the hull and analyse its performance,
to compare with other ships. Were they building better ships by 1619 than the MR,
for instance? Reconstruct digitally and we could do that. So we end up with a matrix
of measurements that we can export to AutoCAD and then take in different directions.
We can use that 3D armature to project conventional plans on. We can produce fancy
drawings, or something more specific, like the 1609 wreck of the Sea Venture, historically
significant as the point at which the inhabitation of Bermuda starts.
When the story of that wreck got back to England in 1610 (uninhabited Bermuda was known
as the Isle of Devils), it came with descriptions of the storm, such as St Elmo's Fire in the rigging,
the terrified crew chucking stuff overboard, and getting washed ashore on a mystic island. That
first-hand account was given to the backers of the Virginia Company, one of whom was the
Earl of Southampton, patron to William Shakespeare, who then wrote The Tempest in 1611.
We don't have much info on the design and construction of ships of that period, around
1609. Shipbuilding reflects the cultural changes over the period
1400 to 1600. So the aim is to survey it with enough accuracy to analyse its performance
and reverse-engineer what went into the hull design. We know ships of this time were designed
to certain geometric principles, from some surviving manuscripts, so maybe some of them
will actually compare. This we did, and number 9 in the Admiralty Library is uncannily
similar to the shape of Sea Venture. My colleagues from the dept of ship science
at the uni have software designed to predict the performance of, say, roll-on-roll-off
ferries, lifeboats, America's Cup yachts etc. From a reconstructed digital lines-plan you can
then find its speed, drag coefficients, stability etc. The way that program worked allowed
me to mimic how a 17C shipwright would have worked. The shape of a hull like SV looks
like a complex curve but is in fact a series of tangent arcs of circles. The shipbuilder assembles
the shape by shifting the centres of the various radii. The question is how to shift the centres
in a controlled fashion, and we were able to do that with ShipShape. Hence the lines-plan, which
we could export to other software that tells us something of the performance qualities of that hull. We
know that SV survived 3.5 days of a hurricane before hitting the reef. It didn't actually sink; it
lodged in the reef and the crew had to get from there to the shore. Performance analysis
shows that SV had a very stable hull. The relationship between the centre of buoyancy
and the centre of gravity produces a very stiff and powerful righting moment, so by the
standards of the time it would have been a very good sea-keeping ship; better, in fact, than the MR,
which was also analysed in that dept.
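The stiffness being described is conventionally captured, at small heel angles, by the righting arm GZ ≈ GM·sin(heel), with righting moment = displacement × g × GZ. A toy illustration with entirely made-up numbers, not figures from the ship-science analysis of SV or the MR:

```python
import math

def righting_moment(displacement_t, gm_m, heel_deg):
    """Small-angle righting moment of a hull, in newton-metres.
    GZ ~ GM * sin(heel); moment = displacement * g * GZ.
    displacement_t in tonnes, GM (metacentric height) in metres.
    All numbers used with this are illustrative assumptions only."""
    g = 9.81                                        # m/s^2
    gz = gm_m * math.sin(math.radians(heel_deg))    # righting arm, m
    return displacement_t * 1000 * g * gz

# A 'stiffer' hull (larger GM) rights itself more forcefully at the same heel:
stiff = righting_moment(300, 1.5, 10)
tender = righting_moment(300, 0.7, 10)
```

The comparison is the point: a larger separation between the centres of buoyancy and gravity (a larger GM) gives a proportionally larger righting moment at every angle of heel.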
Another example of accurate survey leading to further results is the reconstruction of Viking ships
at Roskilde: the remains of 5 Viking ships, each different, and 1:1 reconstructions, with ongoing testing
of these in different conditions. We now know a lot more about Viking seafaring. You can
do a certain amount with models and tank-testing, but if you can afford to build at 1:1
then you get data that you cannot get any other way.
Another example: a stereo viewing system developed between us and the Woods Hole
institute in the USA, underwater stereo photogrammetry software which will allow us to compute
the 3D position of anything in the resultant photographs - that's the goal.
Since 1995 we have been comparing the goodness of different acoustic systems,
and decided to buy one made in Norfolk. We tested it on a number of sites including the MR,
but also the Grace Dieu of Henry V at Bursledon on the River Hamble, reportedly struck
by lightning and burnt to the waterline in 1439. Conspiracy theorists say it was set fire to
deliberately, to get all the iron nails out of it - about 23 tons of iron. The official story was the lightning one.
Essentially it was mothballed in a mud berth. The lower hull, below the mud, is still
there. The GD was begun in 1416, completed in 1420, and made its one and only voyage, around
the IOW in 1421, during which the crew mutinied, and that was the end of its service. The mutiny
was for political reasons, nothing to do with the ship as far as we know. But it was a
monster, nothing bigger for 200 years: 1400 tons. The uni of Soton bought the GD
from the MOD for 5 GBP in 1971, so a stewardship obligation. We survey, monitor
and excavate occasionally. Originally it was thought to be Saxon or Viking etc. But
Michael Pring? and R C Anderson in 1933 carried out some low equinoctial-tide excavation,
sort of 6 in the morning or evening. To build so large a ship with clinker construction
the wrights had to devise techniques that are still none too clear. It was surveyed in 1989,
and we went back and resurveyed with GPS and acoustic sub-bottom profiling, and got
quite a high concordance between the 3 surveys. We were interested in using a range of sub-bottom
techniques. We are trying to get a system that sees in higher resolution, with more reliability,
and can tell cultural material from sediment - quite a tall order.
Here is the riverbed, here the sub-mud geology, and here bright reflectors off the buried hull. We
go back and forth over the buried hull to create slices, a transect at a time, and so obtain an amplitude
map: an indication by colour of where the brightest reflections are, where most sound is reflected.
We can use that to create a 3D seabed model. That was the state of our art up to about
5 years ago. My colleagues then built a 3D sub-bottom profiler. It uses the same transponders,
but picks up returns on a bedstead array rather than singly. The array is fixed to 4 GPS receivers,
so for every ping we have RTK (real-time kinematic) positioning and a 3D model directly.
We can use that to reconstruct lines; perhaps not as accurate as the lines I was able to
construct from SV and the actual structure, but it's a start.
A quick rattle through what is appearing in oceanographic survey. A side-scan trace
from an ROV: the stern of a 17C Dutch ship?, still with the carvings, showing the preservation
that is characteristic of the deep waters of the Baltic. The Vasa is by no means an isolated
case. The preservation in the Baltic is likely due to its very low salinity: the creatures
that eat organic material, including wood, don't live in the Baltic. So
wrecks of 3 or 400 years ago sit on the bottom and still look like ships, including
some with their masts still up. Using a combination of side-scan, producing an
aerial-style survey, and penetrating sub-bottom profiling, a swath-bathymetry? system is designed
to produce a 3D topographic map in real time, so it can be used to survey wrecks quickly
and accurately. If we get the frequencies right we can send the sound through the
structure and so see internal structures.
Q&A, recording failed