What is the Raging Debate in your area of geekery?

Since it’s animation (you’re not watching real people’s lips), it makes sense that dubbing shouldn’t be a problem. I much prefer the idea of watching dubbed anime to subtitled, except for the fact that the vast majority of the voiceover work sounds so shitty. Why is that? I assume that we’re dealing with professional voice actors - why can they not just read and deliver their lines like real, normal stage actors? Why must it sound so cartoony, exaggerated, and ridiculous?

Not necessarily. When I was in animation school at Sheridan College, we learned that mouths are often animated to match the way the real mouth moves when speaking the words, and the rest of the animation is timed around that. The drawings may correspond to the actual mouth movements with varying degrees of accuracy, but the timing will match the original words.

Western animation, yes. But in Anime the mouths are just animated to move in general (at a regular pace, even) while the character is talking.

The biggest controversy at the moment is not beyond the comprehension of outsiders, although some of the interwoven reasons it’s such a huge concern might be. The controversy: FileMaker Inc is going to do product activation. Why it’s so controversial: FileMaker is a multi-user product, and developers of database solutions (folks like me) generally do installations and upgrades at facilities on tens to occasionally hundreds of workstations; facilities running our solutions constantly have machines go out of service and get replaced by others. The developer community thinks this product activation thing is going to be a nightmare.

Biggest controversy that’s most likely to make everyone else’s eyes glaze over: pick one:

• Whether to build from scratch or make some degree of use of a Recovered file;

• How $Variables and $$Variables ought to behave, versus how they do behave, outside of the scripts that set them;

• Whether FileMaker Inc blew it by going with Tabs instead of Template layouts;

• Naming conventions. You want 8 loudly shouted points of view, go into a room of 7 FileMaker developers and start ragging on someone’s field / table / etc naming rules;

• Data-separation model, single-file solution, or modular multi-file approach?

• Is anchor-buoy the best way to set up your relationships on your relationship pane?

Thanks for your interest. Well, take item 1 on my list. The US Playing Card Company in Cincinnati makes ‘Bicycle’ brand playing cards. The majority of magicians who do card magic use Bicycle ‘Rider Back’ playing cards. This isn’t because there is anything ‘trick’ about them. It’s because they are good cards: well-made, they fan and manipulate pretty well, and yet they can be bought in bulk for a moderately low price (which can be important, given that magicians often get through more than one deck per week).

Some (like me) see nothing wrong with this, and like using ‘bikes’. Some take the view that, precisely because all the other magicians are using ‘bikes’, they want to use something different and avoid the ‘herd mentality’. So they go for Steamboats, Fox Lake, Aviator, Piatnik or some other brand/make.

Some (very small in number) seem to think it is more or less essential to use Bicycle cards, for a variety of reasons (some of which I can’t reveal because of the secrecy rules).

And some (another very small number) wouldn’t be seen dead with a deck of ‘bikes’, either because they just don’t like them as cards or because they think the brand has become tainted by association with magicians.

All of which is of course totally irrelevant to the average spectator in the average audience. They only care whether the magician is good at his job and entertaining, not what brand of cards he’s using.

Even that can depend greatly on the anime too, I think; different studios animate in different styles for different shows. I like to tell people “If you don’t like Anime, you’re just not watching the right Anime for you.”

Amusingly enough, the same expression works if you substitute “Gundam” for “Anime.”

Point noted. Sheridan taught very classical Western animation; one of my teachers was from Disney, and Disney, Pixar, etc. hire from there extensively.

A lot of programming language debates can be summed up in one big meta-debate that’s been going on since the 1950s: Lisp or machine language? That is, should programming languages move towards high-level abstraction and rely on smart compilers to keep things efficient (the Lisp side), or should they remain fairly close to the hardware and rely on humans to keep things correct and readable (the machine language side)?

(To be fair, there are a lot of debates that don’t boil down to this (the indentation fight is one of them) and there are a lot of languages that aren’t fought over in these terms, either. Those languages are domain-specific languages, used by people in one specific problem domain to solve problems in that domain. Cobol is the most-used domain-specific language, solving problems non-technical businesses have with payroll and other accounting chores.)

Over the past half century, programming languages have been moving steadily towards the Lisp side of things, but there have been some notable reversals. For example, Algol and later C displaced Fortran and assembly language as general-purpose application-building languages, but C has resisted efforts from both C++ and Java to displace it totally. (They have in some places, but not everywhere.) Even now, with increasingly parallel computers being built (something C has a hard time handling), C isn’t going anywhere for the time being. In fact, a very high-level language such as Haskell would probably be better for writing parallel code than most humans writing machine code would be*, but it seems the odds of a Haskell-like language taking over are slim.

*(Why would Haskell be better? Because it can enforce a very comprehensive set of constraints on how data is passed around – something called ‘referential transparency’, for one – that allow the compiler to generate extremely good code that would break most people’s brains if they had to write it by hand.)
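Here’s a toy sketch of what that buys you, assuming GHC and the parallel package (costly is just a made-up stand-in for real work): because costly is referentially transparent - its result depends only on its argument - the runtime is free to evaluate the list elements on as many cores as it likes without any chance of changing the answer.

```haskell
import Control.Parallel.Strategies (parMap, rdeepseq)

-- A pure, deliberately expensive function: no I/O, no hidden state,
-- so evaluating one call can't interfere with any other.
costly :: Integer -> Integer
costly n = sum [1 .. n * n]

main :: IO ()
main =
  -- parMap may farm the list elements out to different cores;
  -- referential transparency guarantees the result is identical
  -- to the sequential version.
  print (sum (parMap rdeepseq costly [2000 .. 2040]))
```

Build with ghc -threaded, run with +RTS -N, and the runtime schedules the work across your cores for you; nobody had to touch a lock.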

However, parallel computers might finally force even the most backwards application-building companies to move towards more Lisp- and Haskell-like languages, because writing software that has to be split across 128 or 1024 different processors at once is hard. You have to think about what gets done by who to what when, and how that new information gets told to everyone else without causing a total meltdown. If you want to strike the Fear of Finagle into the heart of a poor C++ programmer, sneak up behind him and chant “Spinlock, deadlock, livelock, mutex!”
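For the morbidly curious, here’s roughly what the “deadlock” verse of that chant looks like, as a toy Haskell sketch with MVars standing in for the locks (the delays just make the fatal interleaving reliable):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.MVar (newMVar, takeMVar, putMVar)

main :: IO ()
main = do
  a <- newMVar ()
  b <- newMVar ()
  _ <- forkIO $ do            -- thread 1: grabs lock a, then wants b
    x <- takeMVar a
    threadDelay 1000
    y <- takeMVar b           -- blocks forever: thread 2 holds b
    putMVar b y >> putMVar a x
  _ <- forkIO $ do            -- thread 2: grabs lock b, then wants a
    y <- takeMVar b
    threadDelay 1000
    x <- takeMVar a           -- blocks forever: thread 1 holds a
    putMVar a x >> putMVar b y
  threadDelay 100000
  putStrLn "main gives up; both workers are stuck waiting on each other"
```

Each thread holds the lock the other one needs, so neither can ever make progress. The fix (always acquire locks in the same order) is the kind of rule a human has to remember on every single line, which is exactly the point.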

Simply put, it’s a lot easier to have smart people write the compilers and, effectively, only think about the hard stuff once, come up with the right answers, and encode them into the compilers the rest of us use.

Finally, something that is NOT a debate in my field of geekery: anyone who thinks all programming languages are equally powerful is welcome to rewrite Quake, Hexen, and Doom 3 in RPG for the AS/400. (If you don’t know what an overpriced washing machine would be doing with a rocket-propelled grenade, don’t ask what RPG really stands for.)

Metaphysics

Ontology is all the rage again. The ontological argument for the existence of God in particular has made a resurgence, due to the work of Gödel, Hartshorne, and (especially) Plantinga in the mid to late 20th century, formulating it with the modals of necessity and possibility. Suber has been the most successful (in my opinion) critic of the argument, but even he concedes its validity, while questioning its soundness, particularly in versions that use Becker’s Postulate. As far as I’m concerned, the proof is sound and conclusive in several of its forms.
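For anyone who hasn’t seen it, the modal skeleton runs roughly like this (my compressed paraphrase, not a quotation of any of those formulations):

1. Possibly, God exists necessarily: ◇□G.
2. Whatever is possibly necessary is necessary: ◇□G → □G. This is the step that holds in S5, and it is where Becker’s Postulate does its work.
3. Therefore □G, and hence G.

Even critics like Suber grant that the conclusion follows from the premises; the fight is over soundness, that is, whether premises like (1) and the Becker-style step in (2) are actually true.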

Parsimony (Ockham’s Razor), once the darling of armchair philosophers everywhere (including me), is now considered by some to be… get ready… simplistic. But I think the problem is a tendency to invoke the principle as some kind of law, along with a popular but erroneous way of expressing it: “among competing explanations, the simplest tends to be the best”.

Kant is catching hell for his longstanding and heretofore unassailed notion that existence is not a predicate. Miller and others have now looked at the issue in a different light, namely that a thing may be individuated by the bounds of its existence. A thing bounded by its existence is the recipient of its bounds, and therefore meets the criteria demanded of predicate cases.

Epistemology

Knowledge as justified true belief (JTB) was attacked in the mid 20th century, but defenders are holding strong. The jury is still way out on this one. Personally, I like the JTB model.

Science is being mutilated by, well, mostly by scientists and students of science. Science is not everything conceivable plus everything that isn’t. Science does not even address, let alone answer, questions of an analytical nature. The most disconcerting thing a sharp debater can encounter is the demand from some idiot for scientific evidence of an analytic claim, such as “1+1=2” or “God exists”.
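(If you want to see what settling an analytic claim actually looks like, here is the entire affair in the Lean proof assistant; note the complete absence of a laboratory:)

```lean
-- "1 + 1 = 2" is true by definition and deduction, not by experiment.
example : 1 + 1 = 2 := rfl
```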

Aesthetics

Sometimes I wish I weren’t an autodidact, so that I could write for peer-reviewed journals. I have some original ideas about the teachings of Christ and goodness as an aesthetic. But without a degree, I would never be taken seriously.

Ethics

I include politics in this category. The noncoercion principle is emerging as a resilient axiom for formulating deductive systems of ethics.

I’m just informed enough about aeromodelling to mention that the big debate there is between Build It Yourself and Almost Ready To Fly. Crusty old modellers who got into it when they were still in short trousers in the 1950s and want it to be the 1950s all over again now they’re well into retirement decry the mass-produced bolt-together kits factory-built in Taiwan and piled high in hobby shops. While they (reasonably) bemoan the decline and fall of traditional design and build skills, they never quite explain how a hobby shop could stay open just on sales of balsa, tissue and dope.

I foolishly stepped into the debate just once. I’d have been better advised to join a “picky eating” thread.

**ianzin** - that is a perfect example - completely, utterly, totally insider geeky - with no real relevance to the task at hand other than, well, it matters to the practitioner - I love it! Got any more like that?

**Liberal** - whoa. :)

**Derleth** - yeah, that’s kinda like a comment Sage Rat made in the other thread (the one about Safety Net tools vs. Finesse tools) about programming - makes sense…

I’m a post-doc in systems neuroscience (‘systems’ referring to the fact that we study the operation of the brain as a system rather than focusing on the way individual neurons or components of neurons work). One of the cool things about neuroscience is that the basic principles of brain function have yet to be agreed upon; we often refer to the field as ‘pre-Copernican’, and I think the revolution will be happening in our lifetime.

There is a debate in systems neuroscience that was once extremely heated but that is cooling down partly, I think, because of how fundamentally important it is: how is information represented in the brain? You may have heard of ‘spikes’, otherwise known as action potentials, which are produced by neurons. It’s generally agreed that spikes are the method by which neurons send signals to other neurons. But the major question is how information is coded into spikes. The canonical answer is that neurons change their rate of spike production when signalling whatever it is that they signal (e.g., the presence of a visual object for visual neurons, a movement for motor neurons). But there’s a lot of interest in ‘alternative’ codes involving every imaginable combination of spikes: precise patterning of spikes from a neuron, correlations in spike timing between different neurons, you name it.
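As a cartoon of that canonical rate code, here’s a toy simulation (made-up numbers, nothing anatomical about it): the model ‘neuron’ carries information only in how many spikes it fires per second, while the precise spike times are left entirely to chance.

```haskell
import System.Random (mkStdGen, randoms)

-- One 1-second trial in 1 ms bins: the neuron spikes in a bin with
-- probability rate/1000 (a crude Poisson-like approximation).
spikeCount :: Int -> Double -> Int
spikeCount seed rate =
  length (filter (< rate / 1000) (take 1000 (randoms (mkStdGen seed) :: [Double])))

main :: IO ()
main =
  -- A stronger stimulus (a higher firing rate) yields a higher count,
  -- even though the individual spike times are unpredictable.
  mapM_ (\rate -> print (rate, spikeCount 7 rate)) [5, 20, 80]
```

The spike count tracks the stimulus intensity even though no two seeds produce the same spike train; a timing-based code, by contrast, would have to carry information in exactly the structure this model throws away.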

Myself, I think the evidence weighs heavily in favor of the classical coding scheme, and that other phenomena people have seen, like the fact that neighboring neurons do tend to produce spikes at about the same time, will turn out to be epiphenomena. But isn’t it incredible that the basic question of how information is represented in our brains is still wide open?

I love this thread – except for the fact that some posters, such as Liberal and spazurek, are describing debates that may be esoteric or jargony in their details, but which have deep implications for all of us – and, to me, such a debate isn’t quite the kind of why-should-anybody-else-care geekiness that the OP seems to be looking for.

So, I’ll give you two less-than-earth-shattering debates from the world of human geographers: one newer, the other older.

The newer one is about the nature and importance of “post-colonialism”, and could perhaps be put: “can the interaction of societies and individuals in today’s world best be explained with reference to the colonial relationship (former or current, literal or figurative), or not?”

The older, perennial one, perhaps not unique to geographers (though I would wager it’s a stronger worry among them than among academics with more “obvious” research responsibilities): “do geographers really contribute anything unique and valuable (besides the teaching of geographic facts, concepts, and processes to young people), or must they be defined by the sub-category (economic, biological, etc.) in which the real work is being done by the non-geographers (economists, biologists, etc.)?”

I’m surprised that Colibri hasn’t stopped by to tell us about the ivory-billed woodpecker debate: Was it, or wasn’t it?

Well, one debate in higher ed is how one should fund academic units. If you allocate tuition dollars to units for their course-by-course enrollment, you end up encouraging duplication, because each unit will develop its own set of general ed courses (to keep the dollars at home). That is, engineering would offer its own composition courses, as would nursing, and arts & sciences would enroll only its own students in its composition courses. This is just not efficient.

But if you allocate money to units according to something other than course enrollments, you end up hurting units like arts & sciences, which tend to provide general ed for other units’ students, and places like the Ed school, which may have many students majoring elsewhere – that is, typically secondary ed students may major in an area like history but take pedagogy courses and student teaching through the Ed school. In short, it’s hard to be fair without incenting units to do things that aren’t good for the University.

Affirmative action is also a huge debate but that’s something the man-on-the-street tends to have an opinion about.

Snorkels - useful backup safety device, or CO2 buildup/entanglement hazard waiting to happen?

Mares/Dacor HUB buoyancy compensator - terrible idea, or most terrible idea ever?

Spare Air: Nice to have as backup, or completely useless in an emergency?

My apologies.

Paper-crafting with rubber stamps and lots and lots of accessories. It is important to have all 65 colors of ink. <hee>

I am personally embarrassed to have bought the latest crank-handle contraption ($45) for dry-embossing when the embossing folders do work with a rolling pin.

I don’t know if 1. is really a biggie. Fischer-Spassky ‘the rematch’ showed that not playing top-class chess weakens your game.

2. is certainly live!

http://www.chessbase.com/newsprint.asp?newsid=2729

Linares tried to fine players while the Sofia agreement has been used a few times.

3. I’d like to add ‘cheating using computers’:

On Monday a “Chess Cheating Town Meeting” at the historic Marshall Chess Club in New York will bring together some of America’s leading chess authorities in a panel discussion about ways to head off computer-assisted cheating in organized chess competitions. The subject has turned hot in recent years and the conference is well worth visiting.

Officials in India suspect just such audacity has been deployed by two leading chess players to cheat. One is alleged to have stitched a mobile phone ear-piece into his baseball cap to get assistance from an accomplice with a computer.

4. And surely ‘Toiletgate’ will rumble on…

I was in a Ph.D. program in behavioral neuroscience at an Ivy League university about 10 years ago. I lost steam quickly because the questions that were available to work on were just so basic. I wanted to understand something about how the brain actually FUNCTIONED as a whole, and I quickly learned that just isn’t possible right now. I am on a mini-mission, on this board and in life, to let people know just what a big mystery the brain still is. People in general tend to hold this paradoxical idea that understanding the brain is really, really hard, yet there must be smart people out there who understand it well. Some of those cover stories in Time and Newsweek make it seem that way sometimes. It just isn’t so, and that is both wondrous and disturbing at the same time.