Student Fails To Hack Into Computer, Sets It On Fire Instead

I wonder if it was a computer science test… :smack:

I can’t condone arson, but I also don’t think counselors should be changing grades on people’s tests, either.

If they take my stapler then I’ll set the building on fire…

The title makes it sound like the setting on fire was a direct result of the attempted hacking, like one of those fictional computer viruses that physically destroys the computer.

You know, there’s parents out there thinking, “Okay, that’s the wrong way to go about it, but I wish my kid cared a little more about his grades.”

I immediately thought of AMC’s show Halt and Catch Fire.

Did the student think the grades are only stored on that computer? And that there were no backups? Big failure there in understanding technology.

Not so fictional… OK, the chances of it happening nowadays are basically nil, but in the past the concept of the “killer poke” was well known, and it definitely could happen. On old machines that gave machine code full access to the hardware and lacked the proper protections, you really could inflict physical damage on components with malicious code.

Check it here: Killer poke - Wikipedia

This concept is also referred to sometimes as HCF (Halt and Catch Fire).
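
For the curious, here’s a minimal C sketch of what a killer poke amounted to. The address and value echo the oft-cited Commodore PET example (POKE 59458,62; 59458 decimal is 0xE842), and the register name is made up for illustration; on an unprotected machine a single unchecked store like this could reportedly push hardware past its rated limits, while on a modern OS it just faults:

```c
#include <stdint.h>

/* Hypothetical memory-mapped control register; the oft-cited PET poke
 * was POKE 59458,62, and 59458 decimal is 0xE842. */
#define HW_CONTROL_REG ((volatile uint8_t *)0xE842u)

/* On a machine with no memory protection, nothing stops user code from
 * storing straight into a hardware register like this.  On a modern,
 * protected-memory OS this dereference simply faults. */
void killer_poke(void)
{
    *HW_CONTROL_REG = 62;
}
```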

No, no. The counselor would have helped him with the hacking.

The files are IN THE COMPUTER…

…it’s so simple…

(The Files are in the computer - Zoolander - YouTube)

The HCF opcode on old Motorola processors put the CPU into an endless loop of memory reads, with the address bus just counting upward. It didn’t actually make the CPU catch fire, and had nothing to do with the things mentioned in the article.

I know of a case where the CPU actually did catch fire, or more accurately, melted down. The old version of a certain test tool would shift data through just about every flip-flop on the design at full speed. The cooling system was not designed for that worst case. The first time they tried it, the CPU melted right through the very expensive prototype board. I know both the test person and the tool vendor, and they were still pissed about it (and blaming each other) years later.

Back in Ye Olden Dayes I recall getting an n[sup]th[/sup]-generation photocopy of a joke quick-reference pocket card of System/370 opcodes, including such winners as

“POD” -> Pound On Drum.
“CFBU” -> Catch Fire and Blow Up.

I recall that another one was

“EOI” -> Execute Operator Immediately

I burning your firewall Yankee dog!

One I would gladly have used on several occasions. We had one clueless second-shift guy who caused more data destruction than any modern hacker ever dreamed of. Somehow Dev always got the call to clean up the mess.

You can still write code that will physically damage computers, although it is trickier than it used to be. If you can manage to get to the firmware of anything with sufficiently low-level hardware access, you can break stuff. Disconnect power from the fans and overheat, or spin any motors faster than they were designed for. Hard drives generally need some careful control software to keep things going. Crash the firmware and you can crash the heads.
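
To illustrate (harmlessly) how exposed those controls are: on a typical Linux box, fan speed, temperature, and the PWM value that drives the fan are plain files under sysfs. The hwmon paths below are common but machine-dependent, so treat them as assumptions; writing to the pwm node (as root, with manual control enabled) is where the danger above comes from, which is why this sketch only reads:

```c
#include <stdio.h>

/* Print one numeric hwmon value if the node exists on this machine. */
static void show(const char *path, const char *label)
{
    FILE *f = fopen(path, "r");
    long value;

    if (f == NULL)
        return;                       /* node not present here; skip it */
    if (fscanf(f, "%ld", &value) == 1)
        printf("%s: %ld\n", label, value);
    fclose(f);
}

int main(void)
{
    /* Typical hwmon node names; the hwmon0 index varies by hardware. */
    show("/sys/class/hwmon/hwmon0/fan1_input",  "fan 1 speed (RPM)");
    show("/sys/class/hwmon/hwmon0/temp1_input", "temp 1 (millidegrees C)");
    show("/sys/class/hwmon/hwmon0/pwm1",        "fan 1 PWM setting (0-255)");
    return 0;
}
```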

“Lieutenant, you are looking at the only Starfleet cadet who ever beat the no-win scenario.”

I would assume that he initially assumed he would be able to hack in, change his grades, and then put everything (other than his grades) back the way it was without leaving any sign of his activity. Backups only enter into the picture if the admin knows something nefarious has happened.

I once worked with mainframes that could be powered down accidentally with a user-mode instruction. I Googled for it just now, getting a single hit: :wink:

In the Golden Era of Mainframe Computers, we noticed that programs would sometimes display errant behavior (also known as “bugs”), but when we tried to demonstrate the same for the programmer to see, it wouldn’t happen again. (This phenomenon is known, of course, to all automobile owners whose cars suddenly work fine when the mechanic is taking a look.)

We felt sure that the machine could tell if the programmer was present, and was programmed to behave differently if so.

This could only happen by the use of the BPP op-code: Branch if Programmer Present.