Right. Both the start and end times are suspect. Consider the Space Launch System. Is that just one program? It evolved out of the Constellation program, which evolved out of various Shuttle-derived designs like the National Launch System, which of course all came out of the Shuttle program. It uses engines which aren’t just built to the same design as the Shuttle engines; they’re literally salvaged from that program. So should the first SLS launch “cost” that of the entire history of Shuttle-like systems? And that’s not even doing anything silly like lumping anything rocketry-related under one program. It’s all stuff that was a direct antecedent of SLS. It adds up to several tens of billions of dollars.
Ark of the Covenant. It's priceless. Takes 4 people to move it. And it's in a small village (supposedly).
This is a goof for legal reasons.
That’s a mighty big warehouse for a small village.
I believe the OP is looking for things that can be shown to exist.
Not built but definitely exists: The state of California. Actually the whole of North America, South America, Eurasia, and Africa are moving.
The Honjo Masamune katana, listed as “priceless”, likely more like $100M.
Really did exist, likely still does.
I don’t know about this one. While the Trinity test was the first device exploded, the Manhattan Project had also produced enough material for (at least) a second device, and within days of the Trinity test, Little Boy (a backup design based on uranium rather than plutonium) was ready.
Apropos of the conversation above, it seems to me that the first product off an expensive assembly line shouldn’t count.
I don’t really disagree, but as noted the boundaries are fuzzy no matter what we’re talking about. Do we only ever count the marginal cost? That seems silly, but then we have to ask exactly how to divide up the R&D cost. And what even counts as the R&D going into a single object? How long do we have to wait for the second, or tenth, or millionth instance to be manufactured before we declare that a thing costs a certain amount?
I read in this article that antimatter is just a little bit more expensive than what you quote:
[…] only a few dozen nanograms have ever been produced artificially. This also makes it far and away the most expensive material in the world to make, with scientists estimating that it costs up to US$25 billion per gram. Part of the difficulty and cost comes from storage, because of course it’s not as easy as just sticking it in a jar, since it will annihilate most containers on contact.
[…]
If storing antimatter seems tricky, transporting it is a whole other level of challenge. In 2020, CERN detailed a new design for a trap that could be used to move large quantities of antimatter over longer distances.
I guess when they speak of “large quantities of antimatter” they have something in mind that you could not see with the naked eye. Or could only see for a very, very short time.
I’m disappointed to see that the CERN antimatter storage device does not use dilithium crystals.
Outdated technology. The current state-of-the-art is beryllium spheres.
Yeah.
Remember that big orange tank of fuel on the Space Shuttle and its two boosters? IIRC one gram of antimatter is enough to launch 12 Space Shuttles.
A little bit of that stuff would go a long way.
Also, forget nukes. A gram of this would demolish a big city.
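A back-of-envelope check on that 12-Shuttles figure, using rough numbers I'm assuming here rather than official ones (~106 t of LH2 in the external tank at ~120 MJ/kg, ~1,000 t of SRB propellant at ~5 MJ/kg):

```python
# Rough sanity check: 1 g of antimatter vs. Space Shuttle launch energy.
# All stack figures below are approximations, not official numbers.
C = 3.0e8  # speed of light, m/s

# 1 g of antimatter annihilates with 1 g of ordinary matter: 2 g total mass.
annihilation_energy = 2e-3 * C**2        # ~1.8e14 J

lh2_energy = 106_000 * 1.2e8             # ~1.3e13 J from the external tank's hydrogen
srb_energy = 1_000_000 * 5.0e6           # ~5.0e12 J from both solid boosters
per_launch = lh2_energy + srb_energy     # ~1.8e13 J of chemical energy per launch

print(annihilation_energy / per_launch)  # ~10 launches; same ballpark as "12"
```

So the 12-Shuttles figure is at least the right order of magnitude.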
All the anti-photons from the anti-matter would quickly demolish your retina.
Maybe Janet Yellen will end up minting the trillion-dollar coin.
As far as we know, photons are their own antiparticles. So any photons emitted from antimatter would act identically to photons emitted from normal matter.
Awww, come on guys, it’s so simple. Maybe you need a refresher course.
[leans arm on hot engine part]
Hey!.. It’s all ball bearings nowadays.
1 gram has a yield of around 40 kilotons.
But around half of that energy gets carried off in neutrinos, so call it 20 kilotons of bang per gram.
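Those two numbers check out with a quick calculation, assuming the gram of antimatter takes a gram of ordinary matter with it:

```python
# Check the ~40 kt/gram yield figure. 1 kt of TNT ≈ 4.184e12 J.
energy = 2e-3 * (3.0e8) ** 2    # 1 g antimatter + 1 g matter, in joules
kilotons = energy / 4.184e12    # ~43 kt total yield
usable = kilotons / 2           # ~21 kt if roughly half escapes as neutrinos
print(kilotons, usable)
```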
Specifically about the cost of the first item off a production line and R&D cost: from a product-costing perspective there is a concept called absorption, where you package all the costs over a period together and divide by the number of products made in that period. The costs that get added in are the materials used to make a specific item (including a scrap overhead), the labor hours to make it, outside costs for the part, the variable manufacturing overhead over the period (electricity, water, stuff that can't be attached to an individual part), and the fixed manufacturing overhead such as the monthly land lease, plant depreciation, and support/supervisory overhead. General sales and admin costs are excluded, so the CEO's private-jet flights don't get added in.
All that is a long way of saying that no one ever reasonably assigns all the cost to the first item off a production line, so for purposes of this thread that would also be an unreasonable thing to do.
Marginal cost is almost the same but eliminates the fixed manufacturing overhead; it is effectively the incremental cost of one more item.
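A toy example of the difference, with made-up numbers:

```python
# Absorbed vs. marginal cost per unit, with invented figures.
units_made = 1_000

materials = 120.0              # per unit, including scrap allowance
labor = 80.0                   # per unit
outside_costs = 25.0           # per unit
variable_overhead = 30_000.0   # electricity, water, etc., for the whole period
fixed_overhead = 200_000.0     # lease, depreciation, supervision (no CEO jet)

variable_cost = materials + labor + outside_costs + variable_overhead / units_made
absorbed_cost = variable_cost + fixed_overhead / units_made   # "full" unit cost
marginal_cost = variable_cost                                 # cost of one more unit

print(f"absorbed: ${absorbed_cost:.2f}, marginal: ${marginal_cost:.2f}")
```

With those numbers the absorbed cost is $455 per unit and the marginal cost is $255; only the first includes a share of the fixed overhead.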
Regarding how companies assign R&D and product-development costs to a product: those have to be expensed (OK, some accounting rules do vary in that respect), but you can't really spread them out over multiple manufactured items, since R&D is not a capitalized item, so those costs also don't go into the cost of the first (or any) item off a production line. That is the way it is reasonably done, so it is probably reasonable here.
If you are building a prototype, then engineering time, special test jigs, materials, and specific outside costs for that part may go against that prototype, but not the whole R&D budget.
If you wanted to look at the total cost of a project rather than an item, then sure, add up everything (R&D, CAPEX, COGS) and divide by the number of items made by that program to get the cost per item, so long as you can say that all the plant and all the R&D value drop to zero after those items are made. However, that is rarely the case.
For a product/project economic assessment you would generally add up the R&D engineering that can be tied specifically to the project and no other, all the prototype and test materials, the per-item COGS multiplied by the number of items you expect to make over the planned period, and the associated depreciation (either as absorbed cost or as a separate depreciation line, but just over the period you are looking at and just for the items in question). You then figure out your revenue from the products minus all the direct product sales costs (again, not your CEO's private jet), and since all those expenses are up front while the revenue is down the line, you end up doing net present value calculations according to your cost of capital.
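A minimal sketch of that assessment, again with invented figures (the discounting is just the standard NPV formula, nothing project-specific):

```python
# Toy NPV assessment: up-front project costs, then discounted product margins.
upfront = 2_000_000.0     # project-specific R&D, prototypes, tooling
unit_cogs = 255.0         # absorbed cost per unit (from the example above)
unit_price = 400.0
units_per_year = 5_000
years = 5
rate = 0.10               # cost of capital

npv = -upfront
for year in range(1, years + 1):
    margin = units_per_year * (unit_price - unit_cogs)   # annual cash flow
    npv += margin / (1 + rate) ** year                   # discount to today

print(f"NPV: ${npv:,.0f}")  # positive means the project clears its cost of capital
```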
You could take all those costs and depreciation and divide by the number of items made, but generally the cost of an item is either the absorbed or the incremental cost.
The project or program value is way more than a prototype or a single item, so to my mind the project cost is not the same as the item or prototype cost. For the Trinity test: although there was only one bomb available at the time, more material was being produced, different designs were being investigated, etc. If the Trinity test had failed, the project value wouldn't have dropped to zero, since there were other paths (or fixes to the Trinity design), so the Trinity bomb was not worth the whole sunk cost.
For antimatter and other exotic isotopes: if antimatter is $25 billion per gram, and they have only made a few nanograms, then the most expensive antimatter article ever made cost $25 times however many nanograms they made. Has someone made a salt shaker full of calcium-48?
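The arithmetic on that is pleasingly small: $25 billion per gram works out to $25 per nanogram. So if "a few dozen nanograms" means, say, three dozen (my guess, not a sourced figure):

```python
cost_per_gram = 25e9
cost_per_ng = cost_per_gram * 1e-9   # $25 per nanogram
produced_ng = 36                     # "a few dozen" -- assumed, not sourced
print(cost_per_ng * produced_ng)     # ~$900 of antimatter, ever
```

All the antimatter ever made would run you less than a decent laptop, if only you had somewhere to keep it.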
The bomb dropped on Hiroshima in WWII was about 15 kilotons for comparison.
Each USS Gerald R. Ford-class carrier costs around the same as the Prelude FLNG vessel.
Later ones may cost more, so they may overtake Prelude, unadjusted for inflation.