Fess up... Have you used AI deliberately?

That’s my big problem with current AI, and why I don’t use it much. What I would find really useful is something that could answer factual questions posed in a natural way. AI just isn’t reliable for that kind of work yet.

The other thing I really could use for my hobbies is a system that solves cryptic crossword clues. I’ve tried it a few times on ChatGPT but I only get nonsense.

Datacenters have always been heavy power users. Your Google searches haven’t been run on some guy’s laptop for the past 20 years.

As for NVIDIA chips specifically, they’re very efficient. The Green500 list tracks the most energy-efficient supercomputers, and the top entries are all running the same NVIDIA chips used in AI datacenters (H100 and GH200):
https://top500.org/lists/green500/2024/06/

In fact they’re an even better fit for AI, since they have low-precision modes that are sufficient for that workload.

Of course, the computations themselves are heavier than a traditional Google search. But they’re also obviously more capable. I see various comparisons with lightbulbs and such… but not with the equivalent energy used by a human. If the AI can bang something out in seconds that would take me 20 minutes, it’s almost certainly a huge win, energy-wise. That amplifies the productivity of everyone with access to the tools: much more output for only a slightly higher cost.
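To make that concrete, here’s a rough back-of-envelope sketch. The per-query figure is an assumption (published estimates vary a lot), and of course the human is metabolizing either way, so this is illustrative rather than rigorous:

```python
# Back-of-envelope only; the AI figure below is an assumed placeholder,
# not a measurement, and real per-query estimates vary widely.

HUMAN_POWER_W = 100        # roughly what a person dissipates doing desk work
HUMAN_TASK_MINUTES = 20    # the 20-minute task from the post
AI_QUERY_WH = 3.0          # assumed energy for one AI query, in watt-hours

human_wh = HUMAN_POWER_W * (HUMAN_TASK_MINUTES / 60)  # 100 W for 1/3 h ≈ 33 Wh
print(f"Human, 20 min at ~100 W: {human_wh:.0f} Wh")
print(f"One AI query (assumed):  {AI_QUERY_WH:.0f} Wh")
print(f"Rough ratio:             {human_wh / AI_QUERY_WH:.0f}x")
```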

It’s like any other tool. That hammer undoubtedly took a large amount of energy to produce. But it means one person can pound nails faster than a dozen with rocks. Same thing here, just higher tech.

I don’t see it as something to ‘fess up’ to. I write code for statistics and data analysis as part of my job, and occasionally use AI (Google Gemini) to produce something for me, or to convert from one language to another. Of course I test it. Sometimes it works, sometimes it doesn’t.

What should I do instead, look it up in the printed user’s guide?

Never for work.

But AI is fantastic for visual mockery of dishonest, clueless antivaxers (they really dislike being portrayed as decrepit sealions in tinfoil hats).

We have a company-specific AI for internal use. I was working on a client proposal, and one of the reviewers wanted me to provide more info on a topic. I thought what I wrote covered everything, but a team-mate suggested I ask the AI tool. Bingo - I had 5 additional sentences I edited into another paragraph. Job done, and writer’s block averted.

A few weeks later the same team-mate asked for help polishing the writing on a document. I suggested the AI. He said since everyone in the company is using it for proposals, the output always reads like a proposal and he was going for a different approach in this case.

I also tested the AI by asking for a summary of the area I work in. It returned a nice paragraph, but missed one key point. I asked it for a few sentences on that point, and it returned a good summary of that as well.

My conclusion is it’s handy for scutwork and first drafts, but you need to check its work.

I can see where it’s useful for any kind of creative writing work. But in my current life I virtually never have to write anything besides texts and message board posts. So that’s why it’s not so useful for me. My wife did use it to write a recommendation letter for a friend who was applying to a school.

I’ve used it to write up observations and analysis of children’s learning. I give it the bullet points of what I observed and it will turn it into a little story, and make comments about what skills the child is displaying or developing. It takes me 15-20 minutes to write something up, and it takes ChatGPT only 3. It is surprisingly good at it, though I sometimes edit to make a specific point.

I don’t do it a lot, because I am pretty sure my boss and a lot of parents would be horrified. The thing is, I know what it needs to say; it is just faster at writing than I am. But someone who didn’t know what to say could do the same thing. So it makes the reports a bit meaningless (which maybe they are?).

I’ve also asked it to produce follow-up activities. It is extremely bad at that, though it can spark some ideas.

Yes. For music, vocally… I love the band Chicago, but on XI (the last album before Terry Kath shot himself) the horn section starts to sing lead. I used AI to insert Terry’s voice (and Cetera’s and Lamm’s, to compare), and I’ve also done this with other songs, by other artists…

I took an unreleased Led Zeppelin instrumental, went into my studio, did a quick vocal without even having lyrics in my head, just roughing it, and uploaded it, wondering if anyone would notice the difference. So far, no one has, including a few people I e-mailed it to.

Yes, I recently tried it out to create some fairly complex PowerShell scripting. It worked shockingly well.

Aside from that I find the “This _____ does not exist” photos entertaining.

Used ChatGPT to create my 3rd-quarter “SMART Plan” goals this year. The bosses think they are great and specific, measurable, achievable, relevant & time-bound. :wink:

I am planning a 4-5 night backpack trip in an area I’m not real familiar with and fed Copilot some info (number of nights, range of miles per day, request to stay on ridges as much as possible, make it a loop) and it came up with something that at first glance looks feasible. One monster day, but that can be the nature of the beast with trips like this. I need to dig in a bit more and refine, but thanks for the idea of using it for trip planning. I expect it will be useful for our long motorcycle trips as well!

I registered for ChatGPT 4 (it’s about $40/year). I asked it a contemporary question (about something that did not exist in 2022) and the answer popped right up.

This is SO worth it.

No joke, this is a huge thing. Some employers are actually requiring programmers to use AI, as well as putting tools in place to measure how much they’re using it: how often they’re accepting suggestions, accepting with modifications, or just rejecting them. I don’t know about other domains, but for coding it’s definitely here to stay.

It’s good for figuring out syntax in languages that aren’t my daily driver. I could probably use it to write a lot more boilerplate than I do, unit tests and stuff, but that involves actually thinking about prompts. But once I’ve thought through a complex prompt, usually that teaches me enough to write the code myself, with more confidence that it’s going to be right.
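To give a sense of the boilerplate I mean, this is roughly the kind of test skeleton I’d ask it for (a made-up example; `parse_date` and the module it comes from are hypothetical, and I’d still review whatever came back):

```python
# Hypothetical AI-generated test boilerplate; "mymodule" and parse_date are
# invented for illustration and would be replaced by real project code.
import pytest
from mymodule import parse_date

@pytest.mark.parametrize("text, expected", [
    ("2024-06-01", (2024, 6, 1)),
    ("01/06/2024", (2024, 6, 1)),
])
def test_parse_date_valid(text, expected):
    assert parse_date(text) == expected

def test_parse_date_rejects_garbage():
    with pytest.raises(ValueError):
        parse_date("not a date")
```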

My research is in applied AI soooooo yes :wink:

This 100%. Having snippets of code to build off of is how I program.

I did ask it the other day if something was possible (had to do with Google Analytics). I figured that it was, but did not have to chase down an answer. It spelled out a road map of how to do it. It’s great.

Until ChatGPT can rack and stack and cable itself, my field should be safe. Though I’ll concede it might be able to configure itself, or at least others of its kind.

I don’t, I won’t, and I wouldn’t even know how to find it. And, considering the results I’ve seen posted here in threads, I am NOT letting AI do engineering.

Just found out today there’s a monthly limit of 1500 on how many times one can use Adobe’s generative AI, which includes Firefly and all the apps that have just incorporated that tech (Photoshop, Illustrator, InDesign, After Effects). Text to image, Generative Fill, and editing Adobe Stock images count. My manager is a little freaked about it, because testing and goofing around with it counts, and he had done so about 100 times.

I’m an English as a Foreign Language teacher. I have used AI:

-With my students to answer some questions and help them practice their relevant real-life skills

-To speed up the process of writing tests (IIRC) and student evaluations.

I’m a software developer. The hard part of my job is finding what’s wrong in our 2 million lines of C code. The easy part is writing the code to fix the issue once I’ve found it. We can’t give an AI access to our source code, so it can’t help with the hard part of the job.

New features are the same story: I’m never writing some basic algorithm from scratch, which is what AI seems to be good for. I’m writing stuff that uses our existing functions and data structures. Again, I can’t share those things with an AI, so it can’t help me write new features.

We’re going to use it to write regression tests, but what we really need help with are unit tests, and that’s another case where it would need to see our code, so it’s a no-go.