Snowboarder_Bo:
I just read an interview with the Top Gear/Grand Tour guys, and I love this part: This is a classic case of someone being an expert in one field and having absolutely zero knowledge of other fields. We have robots that can make sandwiches, boil eggs, climb stairs and open doors. FFS, we have some that can do standing backflips.
Well damn, why am I still making my own sandwiches? Where can I get one of these sandwich-making robots for my house?
$22,000 for Baxter
$29,000 for Sawyer
Kind of expensive for your house, but if you eat a lot of sandwiches, maybe it’d be worth it.
Well damn. It’s cheaper to get married again.
(kidding)
There was a good article at Motherboard earlier this week*: AI-Assisted Fake Porn Is Here and We’re All Fucked
There’s a video of Gal Gadot having sex with her stepbrother on the internet. But it’s not really Gadot’s body, and it’s barely her own face. It’s an approximation, face-swapped to look like she’s performing in an existing incest-themed porn video.
The video was created with a machine learning algorithm, using easily accessible materials and open-source code that anyone with a working knowledge of deep learning algorithms could put together.
It’s not going to fool anyone who looks closely. Sometimes the face doesn’t track correctly and there’s an uncanny valley effect at play, but at a glance it seems believable. It’s especially striking considering that it’s allegedly the work of one person—a Redditor who goes by the name ‘deep fakes’—
Instead, deepfakes uses open-source machine learning tools like TensorFlow, which Google makes freely available to researchers, graduate students, and anyone with an interest in machine learning.
Artificial intelligence researcher Alex Champandard told me in an email that a decent, consumer-grade graphics card could process this effect in hours, but a CPU would work just as well, only more slowly, over days.
“This is no longer rocket science,” Champandard said.
The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016. It isn’t difficult to imagine an amateur programmer running their own algorithm to create a sex tape of someone they want to harass.
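The trick reportedly behind these swaps is a pair of autoencoders that share one encoder but have a separate decoder per identity: train on lots of photos of each face, then run one person's frame through the other person's decoder. Here's a toy NumPy sketch of just the structure (my own illustration with untrained random weights, not the actual deepfakes code; all the names and sizes here are made up):

```python
# Toy sketch of the shared-encoder, two-decoder face-swap scheme.
# Untrained random weights: this shows the data flow, not a working model.
import numpy as np

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    return rng.normal(scale=0.1, size=(n_in, n_out))

# One encoder compresses any face; one decoder per identity reconstructs it.
W_enc = layer(64, 16)      # shared encoder weights
W_dec_a = layer(16, 64)    # decoder trained only on person A's faces
W_dec_b = layer(16, 64)    # decoder trained only on person B's faces

def encode(face):
    return np.tanh(face @ W_enc)

def decode(code, W_dec):
    return code @ W_dec

# The swap: encode a frame of person B, but decode it with person A's
# decoder, so the output has A's appearance in B's pose/expression.
frame_b = rng.normal(size=64)   # stand-in for a flattened face crop
swapped = decode(encode(frame_b), W_dec_a)
print(swapped.shape)            # (64,)
```

The point of the shared encoder is exactly the "enough images of your face" problem: the latent code captures pose and expression, and whichever decoder you pick supplies the identity.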
Add in Adobe’s new software that lets you make people say things: its algorithm needs less than 20 minutes of audio to mimic anyone’s voice.
Now add in the software that can make anywhere be anytime.
:eek:
I think our definitions of privacy and trust are about to change a lot.
*ETA: I see on re-read that that article links to the Motherboard article; nice!
I just found out about Lyrebird (Liar Bird; get it?), a company that says their algorithm can mimic a person’s speech with only a one-minute sample. The examples at that site are pretty convincing, but no way in hell am I giving them a sample of my voice to try it myself, just in case it is that good.
JohnT
December 31, 2017, 9:59pm
47
AlphaZero AI beats champion chess program after teaching itself in four hours
AlphaZero, the game-playing AI created by Google sibling DeepMind, has beaten the world’s best chess-playing computer program, having taught itself how to play in under four hours.
The repurposed AI, which has repeatedly beaten the world’s best Go players as AlphaGo, has been generalised so that it can now learn other games. It took just four hours to learn the rules to chess before beating the world champion chess program, Stockfish 8, in a 100-game match up.
…
“Starting from random play, and given no domain knowledge except the game rules, AlphaZero achieved within 24 hours a superhuman level of play in the games of chess and shogi [a similar Japanese board game] as well as Go, and convincingly defeated a world-champion program in each case,” said the paper’s authors, who include DeepMind founder Demis Hassabis, himself a child chess prodigy who reached master standard at the age of 13.
Thought y’all would find this interesting.
AIs have out-scored humans on Stanford’s reading comprehension test:
Chinese retail giant Alibaba has developed an artificial intelligence model that’s managed to outdo human participants in a reading and comprehension test designed by Stanford University. The model scored 82.44, whereas humans recorded a score of 82.304.
The Stanford Question Answering Dataset is a set of 10,000 questions pertaining to some 500 Wikipedia articles. The answer to each question is a particular span of text from the corresponding piece of writing.
Alibaba claims that its accomplishment is the first time that humans have been outmatched on this particular test, according to a report from Bloomberg. Microsoft also managed a similar feat, scoring 82.650, though those results were finalized shortly after Alibaba’s.
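Scores like 82.44 come from SQuAD’s automatic grading: a predicted span earns credit if it matches a reference answer after light normalization. A rough sketch of the exact-match idea (my own simplification, not the official evaluation script; the example answers are made up):

```python
# Simplified SQuAD-style exact-match scoring: lowercase, strip punctuation
# and articles, collapse whitespace, then compare against gold answers.
import re
import string

def normalize(s):
    s = s.lower()
    s = "".join(ch for ch in s if ch not in string.punctuation)
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction, gold_answers):
    return float(any(normalize(prediction) == normalize(g)
                     for g in gold_answers))

# Two toy question results: one correct span, one near-miss.
preds = [("Gale Crater", ["gale crater"]),
         ("in 2012", ["2012"])]
score = 100 * sum(exact_match(p, g) for p, g in preds) / len(preds)
print(score)   # 50.0
```

Under scoring like this, "beating" humans means the model picks the exact gold span slightly more often than the human annotators agreed with each other.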
Also, we now make prosthetics that feel.
A woman who lost her arm over 20 years ago has received the first portable bionic hand, which through a series of tiny electrodes and sophisticated sensors, has restored her sense of touch.
The technology unites the portable bionic hand with a computer that translates the information coming from the artificial fingers into a language the brain can understand, which it then sends back to the body through the electrodes.
This breakthrough is the result of many years of robotic research carried out by teams in Italy, Switzerland, and Germany. Even though she’s central to this amazing innovation, Almerina Mascarello, who was chosen to test the prototype for six months, doesn’t feel like a superhuman. Instead, she told BBC that the prosthetic limb gave her back some of life’s simple pleasures, such as getting dressed or tying her shoes with no help. “All mundane things, really, but important. You feel complete,” she said.
Things are moving fast, eh.
Chronos
January 18, 2018, 4:03pm
50
No, that’s Lyrebird, which isn’t even named for its vocal abilities: its tail plumage looks like the harplike instrument.
A lifeguard in Australia used a drone to rescue two swimmers. The rescue took about 2 minutes; without the drone it might have taken 6 minutes for a person to get to the pair. Cite and cite.
A $430,000 investment by the government pays off in its first hours of use, offering zero risk to the drone’s operator while saving two lives; that totally fucking rocks!
Amazon Go store opens with no cashiers, no lines and no registers.
Shoppers enter by scanning the Amazon Go smartphone app at a turnstile. When they pull an item off the shelf, it’s added to their virtual cart. If the item is placed back on the shelf, it is removed from the virtual cart. Shoppers are charged when they leave the store.
The company says it uses computer vision, machine learning algorithms and sensors to figure out what people are grabbing off its store shelves.
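The hard part is the computer vision; the cart logic itself is simple bookkeeping. A minimal sketch of the take/put-back/charge flow described above (my own toy model, not Amazon’s system; the item names and prices are invented):

```python
# Toy virtual cart: taking an item adds it, putting it back removes it,
# and leaving the store charges whatever remains in the cart.
from collections import Counter

class VirtualCart:
    def __init__(self):
        self.items = Counter()

    def take(self, item):
        self.items[item] += 1

    def put_back(self, item):
        if self.items[item] > 0:
            self.items[item] -= 1

    def checkout(self, prices):
        """Total charged when the shopper walks out."""
        return sum(prices[i] * n for i, n in self.items.items())

cart = VirtualCart()
cart.take("sandwich")
cart.take("soda")
cart.put_back("soda")            # changed my mind
total = cart.checkout({"sandwich": 4.50, "soda": 1.25})
print(total)   # 4.5
```

Everything interesting happens upstream of this: the sensors and vision models have to decide, reliably, which shopper took which item off which shelf.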
I’m telling ya, this shit is moving faster and faster. And once we figure out how to accumulate knowledge/tasking from several AI bots into one, it’s gonna snowball like nothing we can imagine. I’m gonna guess much less than 10 years until that happens.
They’re still tiny, but mid-air holograms are now a thing that humans can do. Cite. Pics at the cite. ETA: There are gifs made from videos, but no actual videos, at the AP page.
Trapped particle makes 3D images is the article at Nature and there’s a link to the paper itself there.
I doubt we’ll have consumer versions available in the next decade, but it’s still cool as fuck.
XT posted this in the GD thread What happens when the robots (peacefully) take over?, but I thought it would be great to include here as well:
Thought I’d link to this series on AJ+ that delves into this subject. I was going to post this in IMHO or another forum, but saw this thread pop back up on the radar, so figured I’d post it here. Basically, this series talks about the future of automation: what jobs are likely to be automated in the near future and what the impact of that may be. There are 4 videos in the series that discuss different aspects; I think they are all about 15 mins or so. It’s interesting stuff even if you already know a bit about this. If you don’t know that much about it, it will really help.
This is new and cool: NASA releases selfie taken by Mars rover Curiosity
Released this week, the photo shows Curiosity in the middle of the dusty, red Martian terrain, with Mount Sharp in the background. The rim of Gale Crater is also visible.
A small, self-focusing camera on the end of Curiosity’s arm took the selfie. Dozens of pictures, all snapped Jan. 23, were used to create the mosaic.
The picture is awesome and the article tells us we can expect even more pics soon, prolly:
Here is NASA’s InSight page, which features an excellent countdown-to-launch clock.
Those Boston Dynamics videos are impressive. Like Bo, I thought they were already amazing a few years ago, but they have continued to improve at a dizzying pace.
The posts about artificial videos sent me down some serious rabbit holes, following link to link and watching dozens of videos.
Snowboarder_Bo:
I just read an interview with the Top Gear/Grand Tour guys, and I love this part: This is a classic case of someone being an expert in one field and having absolutely zero knowledge of other fields. We have robots that can make sandwiches, boil eggs, climb stairs and open doors. FFS, we have some that can do standing backflips.
Yeah, wow. Just amplifying and passing on the ignorance.
Snowboarder_Bo:
So, uh, here’s a thing: Google’s DeepMind AI built another AI called AlphaZero. AZ is similar to AlphaGo Zero: AGZ was made to master the game of Go; AZ was made to master chess.
Which it did in 4 hours, without any human interaction.
That is crazy. The evil side of me wishes I could see the looks on the Stockfish programmers’ faces when they learned about this.
Sage_Rat:
For anyone who didn’t see it:
Soft robots that are strong
This is quite important for robots that can safely interact with humans. It may also be useful for making them lighter.
That is WILD. A 1kg muscle that can lift 1,000kg? And feel soft? Amazing.
These two CNET articles, about a sex doll company that is transitioning to being a sex android company, are pretty interesting. They do depict artificial nudity, just so you know.
[spoiler]An inside look at how Abyss Creations makes sex robots - CNET
https://www.cnet.com/news/abyss-creations-ai-sex-robots-headed-to-your-bed-and-heart/[/spoiler]
Still early days, way in the uncanny valley, but coming along!
That took me a while to look at and read all of it; it was fascinating.
The pictures were amazing, particularly the finished eyes.
The article was extremely in-depth. I thought it did a decent job of bringing many perspectives into the discussion, but I was glad the author focused on the owners and on the CEO (ostensibly the “pro-” side), as it’s really their story.
The video of his conversation with Harmony was fun and funny and showed me that they really aren’t that far along yet with their digital personalities. OTOH, the article also details that the AI does log data on conversations and thus “learn” things, so perhaps that was a part of the awkwardness and non-sequiturs.
If AI is anything like the internet, porn will be the industry that leads the way.
Story here .
The arm was developed by Johns Hopkins Applied Physics Lab as part of their program Revolutionizing Prosthetics. The aim of the program, which is funded by the Defense Advanced Research Projects Agency (DARPA), is to create prosthetics that are controlled by neural activity in the brain to restore motor function to where it feels entirely natural. The program is specifically working on prosthetics for upper-arm amputee patients. While this particular arm has been demoed before, Matheny will be the first person to actually live with the prosthesis. The program does hope to have more patients take the tech for a long-term test run, though.
While the prosthetic device is impressive, it’s not a limitless, all-powerful robot arm. Matheny won’t be able to get the arm wet and is not allowed to drive while wearing it. Keeping a few rules in mind, Matheny will otherwise be free to push the tech to the edge of its capabilities, truly exploring what it can do.
There’s a 3 minute video on the page that is fucking astounding to watch.