The BBC have revisited the subject and have a nice quote from Peter Knight:
"Every single programme we looked at was worthwhile - there was not a dud amongst them," Sir Peter Knight, head of the panel which compiled the lists for the Science and Technology Facilities Council (STFC), told BBC News.

Those are nice words, and useful ones as well when countering the accusations of low-quality science that I am sure will appear as the bunfight intensifies over the next couple of weeks.
We humble STPers get a mention later on:
Some scientists believe the list has been fudged and that certain areas of science have fallen through the net. For example, all ground based solar terrestrial physics facilities have been earmarked as "lower priority", described by one researcher as "absurd".

Not my quote this time. The reporter hits the crux of the problem a little further down:
The reason for these oversights, some researchers believe, was that the STFC advisory boards were too small (the PPAN board consists of eight people) and therefore did not represent every area of science they were making decisions about.

How very true. PPAN were already at a disadvantage, having received zero input from the wider community; they then had to decide on the rankings (bunch of rankers, anyone?) with no more than marginal (at best) experience in some areas of science.
This is like asking a group of eight biologists to decide whether particular instruments (and not even the science) are important in the field of computer science, without any guidance and according to a strategy document that no one agrees with. As the current ranking stands, we have three gravitational wave experiments and no solar terrestrial physics.
Anyway, the Guardian has another piece as well. This one kicks off nicely with Merlin. Of course, a lot of the lower-priority projects have caveats associated with them and so will be funded anyway, which means that some folks who think they might be safe could have a rude awakening come April. By the way, has anyone ascertained why the programmatic review must take place every two years? I have seen it asserted, but never really explained why such a timetable exists. It hardly seems like an efficient use of resources. One could be forgiven for thinking that programmatic reviews will continue until all the projects that some folk want closed are closed. Then a rethink might be in order.
Demands for transparency in the consultation process have been covered in Research Day UK.
Let's see how that goes for us. Considering that the embattled STFC CEO, Keith Mason, has been widely quoted as saying that he wanted to make the peer review process more transparent, he has done far from a sterling job on that front.
The RAS has welcomed the consultation but is dismayed at the rankings and the short time-scale. Don't they realise that if we had longer there might be a danger of real engagement with the community, and of other solutions emerging?
That would not fit with the grand vision of STFC (the new model council™).
The RAS seems to think that STFC is listening to them; I am not sure why, considering the pitiful response to their statement. Perhaps the RAS felt it was their intervention that helped with Gemini. That's nice for the Gemini astronomers, and I am happy for them, but perhaps now the weight of the RAS could be bent to supporting some of their other members.
Worryingly it seems as if the RAS might have bought the STFC spin about ground-based STP. Perhaps they should check out the last evidence session.
I am sure more evidence is winging its way to the select committee to refute one or two points that were made that day.
I almost forgot. This humble blogger (along with some others) has been quoted in a post on The Great Beyond blog at Nature.com.