I understand the perspective you're taking, but I politely disagree. I think you're missing the bigger point here.
Let's say someone is interested in baking a cake. There is a LOT you can learn, spanning general baking techniques, chemistry, design, art, tasting, etc. But if your immediate response is "we need flour, so go plant some wheat and wait a few months," they would likely lose interest.
Teaching people to plant and harvest wheat is awesome, but for most people it probably shouldn't be the first thing you're met with when you are trying to learn how to bake a cake.
If you're studying "Baking Science", which covers everything from the beginning to the end, starting with "Let's grind some flour" is a good idea. People studying "Baking Science" need to understand the whole process, rather than believing everything starts and ends with pre-packaged recipes and machines that do everything for them.
That's what's going on here. A "Baking Science" curriculum that didn't impart people with a knowledge of where flour comes from and how it's made would be a joke.
There may be space in the whole curriculum for that, but it's probably not essential here and now. The whole pull of Bret Victor's presentations is that they show us what it would look like to program if our tools were as modern as Word.
The baking-metaphor problem is that you have students who are supposed to come in and investigate how the arrangement of toppings on a pizza affects both nebulous qualities (like deliciousness and heterogeneity) and rigorously measurable ones (like moisture and elasticity) of the pizza crust. However, when they come to your kitchen, most professors put them in a totally new room containing millstones, grain, and milk ready to be turned into fresh mozzarella, with nothing labeled. There are reasons for this -- real pizza aficionados make very different choices about how they want to compose their sauces, which cheeses they want on the finished result, and even what leavening agent causes the dough to rise, so the framework for pizza-baking is as general-purpose as possible. But those reasons make things difficult for the newcomer.
The professor is just saying, "when we start, I walk everyone through the process of finding the flours over here and the additives over there, and using the bread machine to mix and knead them. I then show them where to find the canned sauces and the pre-grated cheeses, so that they can start baking up some pizzas for science with minimal prior knowledge. Our concerns are very high-level and I want them to be fussing with baking times and topping arrangements, but so many of my students seem to get stuck on trying to turn milk into mozzarella."
The problem he overlooks is that we do not want our tools to be "as modern as Word". We want modern tools, but not modern as Word defines it.
There are excellent solutions for his problem. Distributing a pre-configured VM is a good one. Instead, he wants students to have the experience of bootstrapping, but he also wants it to be painless and magical.
But instead of looking at his actual problem, he's wound up railing against all the critical freedom that makes the field something other than a glorified exercise in painting by numbers.
EDIT: For the record, turning milk into mozzarella is actually really easy and quite suitable for a novice. I've done it. Takes about an hour, end to end.
My understanding is that this is not so much for a single course (where a VM would be a good solution) as for general student research. A prepackaged environment works less well there (though it may still be made to work).
> If he decides to ignore all the tools (puppet, chef, scripts, etc.) designed to make all of this easier, that's his fault.
You're kidding here, right?
I find Puppet and Chef superconfusing and not worth the effort to learn at my job right now, and I'm a fucking programmer by hobby and profession. This is exactly the kind of bullshit people doing science should never have to deal with.
Puppet, at least, is pretty straightforward. You are describing what you want your system to look like. Puppet takes that description and makes your system look like that.
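To give a concrete sense of "describing what you want": a minimal Puppet manifest really is just a list of desired end states. This sketch is purely illustrative -- the package name and path are hypothetical, not from anyone's actual setup:

```puppet
# site.pp -- hypothetical manifest; Puppet converges the system toward this.
# Declare that python3 should be installed (Puppet figures out how).
package { 'python3':
  ensure => installed,
}

# Declare that a working directory should exist with these permissions.
file { '/opt/lab':
  ensure => directory,
  mode   => '0755',
}
```

Running `puppet apply site.pp` repeatedly is safe: if the system already matches the description, Puppet does nothing.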
People doing science who want to use computers should expect to have to learn a thing or two about using them. As in more than using Word if they want to do complex, custom, not-done-before tasks. Much like people who want to do novel things in chemistry should expect to learn more than how to make black powder.
This guy is upset that novel things haven't already been thought of and planned for by the people who make shiny GUIs. This is a farcical position. If it's really that novel, of course nobody's written a GUI for it.
Point is, tools to address his problem already exist. He dismisses them, because they don't do it in an arbitrarily flexible and powerful way while still being infinitely iTunes-y.
> People doing science who want to use computers should expect to have to learn a thing or two about using them.
A thing. Or two. Not half a year's worth of full-stack dev education.
He is not complaining about having to learn things. He's complaining about having to learn irrelevant things. Infrastructure. He wants to make a soup, and he's being asked to run his own plumbing to get water, and to drill his own gas for heating. And people here are saying he should stop complaining, because nobody is making him build his own drill - it's already provided via Puppet script in a Git repository.
He doesn't dismiss the tools because they ain't iTunes-y. He dismisses them because to use those tools he has to learn more tools, for which he has to learn even more tools, and all that effort is throwaway, because the next time he will need to learn different toolchains (or should I say, tooltrees with stupidly high branching factors).
> Puppet, at least, is pretty straightforward. You are describing what you want your system to look like. Puppet takes that description and makes your system look like that.
It makes sense for a team of web developers doing high-scalability applications. It is bullshit for a researcher who just wants to crunch some numbers with a bit of Python code.
There's the problem. He doesn't understand what the proper bounds of relevance are. He can't see how a given task is relevant, so it's bullshit. That's more a comment on the limits of his thought processes than anything else.
He wants to do novel things. This means going places where not everything is preconfigured for his pleasure. It also means he needs to know how to use his tools, because when he runs off the edge of what point-and-drool does for him he will need them.
He asks for a world where point-and-drool covers everything. All I can say is that what he asks is impossible for what he wants.
The analogue in CS would be to give the students a pre-configured environment if you want to deep dive into a specific topic before teaching how to set things up. If the students are using a known OS, all you need to distribute is a shell script. Otherwise give 'em a Vagrantfile or something.
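The known-OS case really can be that small. Here's a sketch of the kind of one-shot script a professor could hand out -- everything in it (directory names, the choice of a Python virtualenv) is my assumption for illustration, not a detail from the thread:

```shell
#!/bin/sh
# setup.sh -- hypothetical one-shot environment setup handed to students.
# Assumes a Unix-like OS with python3 already installed.
set -e

WORKDIR="lab-env"                      # hypothetical course directory
mkdir -p "$WORKDIR/data" "$WORKDIR/notebooks"

# Isolate the course's Python packages from the system install.
python3 -m venv "$WORKDIR/venv"

echo "Environment ready in $WORKDIR"
```

The student runs one command and gets a working, isolated environment; nothing about Puppet, provisioning, or infrastructure ever enters the picture.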
That being said, you'll want your students to understand their tools sooner rather than later.
To be fair, as of 2014 there are on the order of 10^2 main disciplines, 10^4 main research areas, 10^6 different things to learn, and 10^7 researchers building stuff to teach; expecting a GUI button for that myriad of functions in all their diverse ways would be naive. Maybe primary-school education tools come close to the simplicity of use you/he describe.