What is the Raging Debate in your area of geekery?

Just exactly *how* underpaid are the people in my profession?

Teacher or Nurse? :slight_smile:

Oxford vs. Cambridge?

Answer: Oxford, while clearly better, has the misfortune to languish in most academic league tables (especially American-compiled ones) as a result of its lower funding. However, at least we (er, Oxford, that is) have a real city rather than a marshy hamlet attached to our university.

It’s just under eleven weeks until Easter and I’ve recently been engaged in some heated debates about which version of the traditional Roman Rite Easter Vigil is better: the 1955 missal or the 1962 missal. Do you want 12 lessons or 4? A litany sung in two parts or straight through?

Esoteric, I know.

Nothing like lawyers to generate raging debates. First an example from the extreme minutiae end of the spectrum.

It is obviously possible to commit a crime even if you aren’t directly responsible for the occurrence in question - if Tony Soprano tells someone to whack someone else, he is guilty of moider, even if he didn’t pull the trigger.

So far so good.

There is an area of law dealing with what happens when two people undertake a criminal plan (like a burglary) and in the course of execution of the plan, one of the Dastardly Fiends does something that, while not exactly within the plan, wasn’t wildly unexpected, either.

For example, A and B plan to burgle a house. While both are in the house rummaging through separate rooms, Burglar A comes across the owner, and kills him without Burglar B actually being present.

Much of course depends on the detail of each case, but (in my neck of the woods) the test for B’s culpability is whether the killing by A was a “probable consequence” of the plan they had both participated in.

Here is where it gets absurd - what does “probable consequence” mean? More probable than not? More than a mere remote possibility? It has got so exquisite that right now, there are arguments about whether it is an appealable error for a court to direct a jury that a consequence is “probable” if it “might” occur, as opposed to whether it “could” occur.

Jeez. No wonder lawyers are the subject of so much ridicule.


You want narrow and incomprehensible? Here’s a taste of what Lispers have been fighting over for the past few decades:
[ul]
[li]Lisp-1 or Lisp-2: That is, should function names reside in the same namespace as variable names? Scheme is a Lisp-1; Common Lisp (and a lot of old dialects) is a Lisp-2. In a Lisp-1, anything named ‘list’ shadows the function list, so you can’t use that name for a variable without screwing yourself. In a Lisp-2, you can use any name for a variable, because you need a special operator (hidden behind a special read-macro syntax) to access the function-named-list, as opposed to the variable-named-list, in any context other than a function call (that is, something that looks like ‘(list 1 2 3 4 5)’).[/li]
Every other language on Earth has a single namespace, like Scheme.
[li]Should macros be hygienic: Scheme has what’s called ‘hygienic macros’, which means that it isn’t possible for a variable in code surrounding the macro’s expansion at compile time to be captured in the macro’s expansion. Common Lisp has no such feature. Schemers think Common Lisp macros are a disaster waiting to happen, and that the precautions Common Lispers take against inadvertent variable capture are ineffective. Common Lispers think Schemers have overly constrained their macro facility and taken away the ability to do some really neat things, such as the anaphoric macros Paul Graham defines in his book “On Lisp” (which is a wonderful way to learn Lisp in general, and Common Lisp in particular).[/li]
No other language family has macros. Below I’ll discuss some implications of that.
[li]Is Dylan a Lisp: Dylan is best described as ‘Scheme with Syntax’. In other words, it looks more like C or Algol than a bunch of nested parentheses with the occasional keyword or number. It works a lot like Scheme, though, except that continuations in Dylan have dynamic extent, while in Scheme they have indefinite extent. (What’s a continuation? See below.) This pits people who think homoiconicity is fundamental to what a Lisp is against people who think homoiconicity is very nice but other features matter more. Homoiconicity means the program’s source code is represented in the primary data type of the language itself. In Scheme and Common Lisp, source code is all lists, and those languages have a lot of functions that make list manipulation very easy. In TCL and TRAC and M4, source code is strings, and those languages do a real number on strings. Homoiconicity means the language can give you macros (TCL, TRAC, and M4 don’t, but Scheme and Common Lisp do), which means programmers can trivially define minilanguages that leverage the full power of the underlying programming language. (Again, more on this below.)[/li]
No other language family takes advantage of being homoiconic, and the vast majority of language families aren’t homoiconic to begin with. This never comes up for anyone else on Earth.
[li]Is Scheme a Lisp: Yes, there are Common Lispers who deny Scheme is a Lisp. They rant and rave about some of the things I’ve already outlined and some points even more obscure, but they really seem to be angry that Schemers refuse to fall in line and make Common Lisp the only Lisp that still matters.[/li]
During the 1970s and 1980s, Lisp was fragmented into a lot of individual dialects. Making code run on more than one dialect at a time was frustrating, even though they all look pretty much the same. Maclisp (Nothing to do with Apple. MAC stands for “Machine Aided Cognition”; Project MAC was an AI project in the 1960s.), Zeta Lisp, Franz Lisp, T, NIL, Lisp Machine Lisp, and many others roamed the Earth back then. Most of them died off in the “AI Winter” of the 1980s, when the “AI Bubble” burst and companies were less willing to spend millions trying to make computers recognize the difference between their normal coffee and Folger’s crystals, but not before their uniqueness was assimilated into Common Lisp (the first object-oriented language to get an ANSI spec). The last holdout is Scheme, hence (IMHO) this evergreen holy war. Lambda, the Ultimate Political Party by Kent Pitman is a fascinating meditation on what it means to be a Lisp.
[/ul]

So, what do I mean when I talk about ‘macros’? Well, in a Lisp (except Dylan, or perhaps not since Dylan has gone its own way with something similar to macros) all source code is a list or a sequence of lists. This means that a Lisp function called at compile time can take an arbitrary amount of source code and transform it into another arbitrary block of source code using standard tools (hidden behind clever read-macros, which work like what C++ programmers call macros: simple text substitution, more or less) in the core language. Macros can be recursive and they can call each other as well; the Lisp compiler won’t finish up until the last macro has returned code that contains no more calls to macros.
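The expand-until-no-macros-remain loop described above can be sketched in a few lines once you represent source code as nested lists. Here is a toy model in Python; the MACROS table and the ‘when’/‘unless’ macros are illustrative inventions, not any real Lisp implementation:

```python
# Toy model of Lisp macro expansion: source code is nested lists, and
# "macros" are functions that rewrite one list of source into another.
# Expansion repeats until no macro calls remain, including macros that
# expand into calls to other macros.
MACROS = {
    # (when cond body) => (if cond body nil)
    "when": lambda cond, body: ["if", cond, body, "nil"],
    # (unless cond body) => (when (not cond) body) -- expands further
    "unless": lambda cond, body: ["when", ["not", cond], body],
}

def macroexpand(form):
    if not isinstance(form, list):
        return form                          # atoms expand to themselves
    head = form[0]
    if head in MACROS:
        # Rewrite the whole form, then keep expanding the result.
        return macroexpand(MACROS[head](*form[1:]))
    # Not a macro call: expand subforms individually.
    return [macroexpand(f) for f in form]

print(macroexpand(["unless", "p", ["print", "x"]]))
# -> ['if', ['not', 'p'], ['print', 'x'], 'nil']
```

The ‘unless’ form needed two rounds of rewriting before it bottomed out in plain ‘if’, which is exactly the recursive behaviour the paragraph above describes.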

Talking about this makes C++ programmers’ heads hurt. :wink: They think you’re either talking about something like the C pre-processor, which is laughably underpowered, or C++ templates, which are subtle, (relatively) powerful, and make the compiler emit a huge stream of unreadable error messages the moment it stumbles over one it doesn’t quite understand. Once they understand it a bit better, they think Lisp programmers spend all their time writing compilers, and they think Lispers are off in the head, because in C and C++ writing compilers is rather tedious and specialized work, not something you undertake in the average program.

When a Common Lisper lets slip that the Common Lisp Object System is simply one huge system of macros, the C++ programmer wonders when unmedicated psychotics got access to real computers. In C++, the object system is bolted onto the core language, which is a heavily modified dialect of C (it is much more anal about types, for one thing). In Smalltalk, a language much closer to Lisp but not quite Lisp-like, the whole language is a thin varnish over a very powerful object system. In neither language could you write an object system in the language itself. In Lisp, this kind of language building is much more common. What’s more, this kind of language building actually works: The default assignment operator in Common Lisp, setf, is a middling-complex macro that knows enough to Do The Right Thing when you’re trying to assign to an arbitrary location in a complex structure.
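To give a rough flavour of the setf idea, here is a sketch in Python: assignment dispatches on the shape of the place expression rather than being a fixed operator. Everything here (the setf function, the ‘nth’ place form, the env dictionary) is made up for illustration and is not how Common Lisp implements it:

```python
# Toy "setf": the place being assigned to is itself a piece of source
# (a name, or a list like ["nth", i, name]), and assignment dispatches
# on its shape. Real setf does this at macro-expansion time and is
# user-extensible; this sketch only handles two hard-coded place forms.
def setf(env, place, value):
    if isinstance(place, str):       # (setf x v): plain variable
        env[place] = value
    elif place[0] == "nth":          # (setf (nth i xs) v): list element
        _, i, name = place
        env[name][i] = value
    else:
        raise ValueError("unknown place form: %r" % (place,))

env = {"xs": [10, 20, 30]}
setf(env, "y", 1)                    # assign to a variable
setf(env, ["nth", 2, "xs"], 99)      # assign into a structure
print(env)                           # {'xs': [10, 20, 99], 'y': 1}
```

The point is that one assignment operator can understand arbitrarily many kinds of places, because the place is just more source code to inspect.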

Finally, what’s a continuation? A continuation is an abstraction representing ‘the rest of the program’, packaged up as a function which (in Scheme) can be called at any time. When the continuation is called, the function that captured it gets resumed like nothing ever happened. You can build any control structure you want out of continuations, from goto to the complex back-tracking that Prolog implements. Continuations can be called an arbitrary number of times, and each call resumes control from the same point (though side effects performed in the meantime are not undone). This is powerful and subtle, and it breaks brains with reckless abandon. Here is a newsgroup post on the subject, and here is the first page of the chapter in “Teach Yourself Scheme” on the subject.
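Python has nothing as general as Scheme’s first-class continuations, but a generator gives a one-shot taste of the core sensation: suspend here, resume later, local state intact. Treat this as a loose analogy only; real continuations are first-class values that can be stored and invoked any number of times:

```python
# A generator is a limited, one-shot cousin of a continuation: each
# yield suspends "the rest of the function", and next() resumes it
# exactly where it left off, with all local variables remembered.
def counting():
    total = 0
    while True:
        total += 1
        yield total          # suspend; the caller decides when to resume

c = counting()
print(next(c))   # 1
print(next(c))   # 2 -- resumed as if nothing happened, total remembered
```

What Scheme adds on top of this is the ability to capture that suspended “rest of the program” as an ordinary value and jump back into it from anywhere, which is why you can build backtracking and other exotic control flow out of it.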

All of the above is to show why Lisp has had such a hard time gaining mainstream acceptance even as language designers have re-invented a lot of its features over the decades: It’s difficult to sell a programmer on a feature he’s never heard of and that languages he’s familiar with don’t support. He tends to see its differences from what he’s used to as downsides, and to fail to understand any of the benefits it brings. Paul Graham has written a lot about this problem in his essays and I won’t repeat (more of) his work here. Instead, I’ll give links:
[ul]
[li]What Made Lisp Different: Contains a list of features once unique to Lisp. Believe it or not, the ‘if’ statement as we now know it was invented in Lisp.[/li][li]What Languages Fix: Languages are usually designed to fix some perceived flaw in the current generation of languages. This page (somewhat facetiously) describes what some common languages were invented to fix.[/li][li]Beating the Averages: His section about “The Blub Paradox” is precisely what I was talking about above.[/li][/ul]

Nouvelle Cuisine: Best known for huge plates, tiny portions and giant towers, there is debate about whether Nouvelle Cuisine had any real substance and lasting effect or was merely a pretentious fad catering to rich yuppies. Personally, I think any movement is going to have its crappy practitioners, and the abominations of nouvelle cuisine were grossly overstated. Some of the trends of the movement that continue on today were great (fanatical attention to freshness and quality, purer and more intense flavour combinations) while others I think went too far (the complete banishment of flour-thickened sauces).

Avant-Garde Cuisine: aka Molecular Gastronomy, has become the new trend in modern haute cuisine, aiming to add scientific rigour to cooking through special-purpose chemicals and new techniques. Critics say that practitioners are merely aping industrial food chemists and coming up with bizarre flavour and texture combinations that are novel for the sake of being novel and detract from the naturalness of foods. Fans say that they are merely taking the most natural route towards coaxing the desired flavour and texture elements out of foods. My personal opinion is that the entire approach has lent cooking many valuable techniques, but not much of it is applicable outside of haute cuisine. I think many of the methods, even from some of the top chefs in the area, have dubious merit. Dinner as theatre, upscale comfort foods, foams and deconstruction I think are hugely overrated; “caviars”, “raviolis” and thermostable gels are interesting concepts but it’s hard to see them gaining deep penetration; sous vide, low-temperature long-time cooking and meat glues I think are amazing, and I’m looking forward to seeing how they will change cooking.

Japanese vs Western Knives: Western blades, primarily made in Germany by Wusthof and Henckels, are made from a high-carbon steel and rely on their weight and thickness to power through foods. Japanese knives are lighter, thinner and made from a harder, high-vanadium steel, which leads to a sharper knife but one requiring much more maintenance. Adherents on either side swear by their choice, and it’s a matter of using the right technique to fit the knife, so switching can often seem alien. Personally, I think Global has done a lot to harm the perception of Japanese knives because, unless you have triangular hands, the handles just seem damn uncomfortable to me. Globals look sexy, which means they appear on TV a lot, and so rich yuppies rush out to buy them and then never use them.

Baguettes vs Sourdough: On one side, the perfect bread is the French baguette, yeast-risen, with its rich, creamy interior and crisp crust. The other side holds the San Francisco-style sourdough to be the perfect bread, with its rich, complex flavour and robust crumb. There are people who plump for other options, of course, but the Paris vs SF debate can get increasingly esoteric, with talk about protein levels, ash content, levain vs yeast risen, kneading, microclimates, water quality, etc.

Grass fed vs Grain Fed beef: Corn fed beef has more marbling which gives it a more tender and luxurious texture but detractors claim the flavour is bland and watery. Grass fed beef is often leaner and tougher and the quality is much more variable due to variations in diet and it’s often described as “minerally” or “gamey” by people used to corn fed. Coming from Australia, I fall heavily on the grass fed side as I find corn fed beef to be rather flavourless but the rarity of grass fed beef in America means that it’s very pricey.

Foie Gras: Detractors accuse foie producers of engaging in excessive animal cruelty and barbarism, while supporters say that foie ducks actually lead relatively decent lives and, if anything, we should be focusing on industrialised chickens rather than foie ducks… plus, it tastes really fucking good. Personally, there’s so much BS propaganda from both sides that it’s difficult trying to figure out what the truth is… plus, it tastes really fucking good, so I’m going to continue eating it.

Brining: Brining is the practice of suspending a piece of meat in a salt solution for an extended period of time so that the meat draws in salt and water. Brined meats are juicier and can withstand more overcooking, but detractors say the texture is compromised. Both sides seem to agree that pre-brined meats sold at supermarkets are an abomination and nothing more than a money-grubbing trick by evil conglomerates. While I can certainly see their point, it always seemed slightly hypocritical to me.

[voice with attitude]

Oh, see - NOW you’re just getting in to it!

[/voice with attitude]

:smiley:

Derleth and Shalmanese - my work here is done - great posts! You are both hereby tasked to start your own threads dedicated to the specific areas for your Raging Debates. Seriously - each of you have listed topics and full sets of debates that look like they could sustain full-on threads themselves.

Part of me wants to engage now - for instance, Derleth, I happen to think that there is a bold line drawn between what the “common man” will do and what technically-chemical fancy foods you can get at the restaurant El Bulli, known for gels and foams and what have you. Food as Theater will simply never cross over to anyone’s house, in ways that say French cuisine transformed many houses in the US in the 50’s. And as a San Franciscan by birth, trust me, there is no debate over bread!

But - this thread is more to frame the debates not engage in them - that would be hijacking! :slight_smile:

Well, dinner as theatre actually has its roots in home cooking, and the avant-garde tradition has been about trying to bring that same degree of intimacy into restaurant cooking. If you take a look at any 50’s cookbook, the trend of carving or morphing common foodstuffs into cutesy sculptures of animals, plants or abstract shapes, and the ubiquity of tableside flambés and fondue parties, was very much dinner as theatre. Of course, it was very tacky theatre and so became promptly buried, but several of the concepts are being revived and given a sophisticated shine. The idea of a single, pre-determined menu driven by the chef rather than a la carte by the diner is very much in the same spirit, as is the idea of paying attention to all 5 senses equally rather than just sight and taste, through the integration of sound and aroma specifically into dining (for example, Alinea uses scented pillows and smoked leaves to create items that are meant to be smelled but not eaten; some other restaurant I’ve forgotten the name of is experimenting with a phased speaker array that allows them to beam specific music to individual diners, synchronised with the dishes).

As for gels and foams, I think those are actually the easiest for the everyday cook to integrate into their routine. Good chefs already use a lot of different gels and thickeners in the kitchen and are familiar with the concept of using the right thickener to achieve the desired effect. Any self-respecting chef should have at least flour, cornstarch and gelatine (either pure or in demi-glace form) on hand and will commonly have potato starch, arrowroot starch and agar agar. Adding tapioca maltodextrin, lecithin, xanthan gum and guar gum as well doesn’t represent much of a shift. Most foams, likewise, are purely a matter of having the right sort of stabiliser and an immersion blender on hand. Carrot foam can be made from 100% carrots with no esoteric chemicals necessary.

Wow, I thought homoiconicity was the entire point of Lisp. I utterly do not see the point of a Lisp without homoiconicity.

I totally agree. If Dylan is a Lisp, then Perl probably qualifies as well and you could make strong cases for Ruby and Python. Not that I think anything that is not a Lisp is bad or even less powerful (in a given domain), but names ought to mean something.

…and I completely referenced the wrong poster when I added a comment about cooking - sorry!!

Oh - and here’s another one, but from Book Collecting: Completists vs. High-Spotters. Completists typically pick one thing - a specific book, collecting every first edition in as many languages and/or published versions (e.g., limited editions, etc.), or an author (complete runs, magazines with stories, etc.). High-Spotters just go after “high spots” - the best books in their mind, across authors and genres.

For decades, many (most?) collectors were completists - or at least that is how the collecting community regarded it. Starting 15+ years ago, High-Spotters have become more dominant - but some in the community feel this is a bad thing, because they think High-Spotters end up deciding what’s “best” in terms of what is appreciating in value, etc.

I am a High-Spotter, but I only focus on books I love. So for example I have Cannery Row, not The Grapes of Wrath - because I prefer the warm Steinbeck novels, not the heavy, depressing ones…

Quina Quen: male or female? One of the major focuses of this argument is a certain piece of equipment and who can equip it, and for such a completely irrelevant and minor part of the game it attracts a lot of passionate responses (it used to be Kuja who was the subject of such debate, but nobody seems to care anymore).
There’s also the question of who is the more deserving of a place in the active party: Garnet the summoner / white mage or Eiko the white mage / summoner.

You mention Lisp and Haskell in the same post yet neglect to mention static vs. dynamic typing! The ultimate flamefest :stuck_out_tongue:

It’s already starting. Look at where C# is heading. It’s moving towards becoming a functional language on the sly.

The effectiveness of the ARVN in the Vietnam war.

I’ve been reading the same book at work about the Vietnam War, for about 3 hours every day on average; I’ve made a game about Vietnam and read about 3 army journals and accounts on the effectiveness of the ARVN. The raging debate I have is whether the US could have just done what it did from 1972 onwards, rather than committing ground troops. I recommend the book ‘A Noble Cause?: America and the Vietnam War’ by Gerard De Groot. Excellent accounts, balanced opinion and facts to boot.

For Seattleites, should we fix the viaduct or replace it with a tunnel, or some other solution?

What kind of bridge should replace the 520 bridge?

Shoes = good or shoes = evil?

Handmade or keg?

Natural Balance or traditional?

Roleplaying Games:

The great theory axes: Narrativism- The core of the game is creating a shared story. Strongly narrativist games resemble round-robin storytelling more than they do D&D.
Gamism- The point of the game is to be a game, that is, the players compete with one another and/or with the obstacles the GM throws in their path for some kind of victory condition. D&D is a pretty gamist game, especially if “winning” is more important than what would make the best story.
Simulationism- The heart of the game is to model reality (or sometimes, reality as it appears in some narrative genre) as closely as possible. Simulationist games usually have the heaviest rules, the better to model the world.

The argument isn’t so much about which is better, but what’s the best way to use these terms, how to precisely define them, and why anyone should give a flying fuck.

Just mentioning White Wolf’s Mage: The Ascension/Awakening games is enough to start a flamewar right then and there. It’s deeply appropriate too, since the first versions of the game were about dueling philosophies.

Martinis: Shaken or stirred? Gin or vodka? How much vermouth? Just what is a “martini”, anyway?

What is the origin of drink <foo>? (martini, margarita, etc.)

Just what is a “cocktail”, anyway?

Rye vs. bourbon.