Never that I’m aware of. And when I am aware that something is generated by AI, I pass it by.
I use Microsoft Copilot (the corporate approved variant) to write one-shot Python scripts and, very occasionally, for general guidance about syntax of tools I don’t use often. I still rely on Google more than Copilot.
Today I asked Copilot about the Oracle syntax for granting a user access to the tables of another, and the answer it gave looked very convincing but wasn’t syntactically valid.
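For what it’s worth, the only reliable approach I know of is the boring one: loop over the other schema’s tables and grant on each one individually, since (outside of the newest Oracle releases) there’s no single “grant on everything that user owns” statement. Here’s a rough, untested sketch of that idea, shown with the node-oracledb driver purely for illustration; the connection details and names are placeholders, not anything Copilot produced, and the same loop works just as well from SQL*Plus or a Python script.

```typescript
// Sketch only (untested): grant SELECT on every table owned by one schema to another user,
// one table at a time. Credentials, connect string, and names below are placeholders.
import oracledb from "oracledb";

async function grantSelectOnAllTables(owner: string, grantee: string): Promise<void> {
  const conn = await oracledb.getConnection({
    user: "admin",                  // placeholder credentials
    password: "secret",
    connectString: "dbhost/orclpdb1",
  });
  try {
    // Pull the schema's table names from the data dictionary.
    const result = await conn.execute(
      "SELECT table_name FROM all_tables WHERE owner = :owner",
      { owner: owner.toUpperCase() },
      { outFormat: oracledb.OUT_FORMAT_OBJECT }
    );
    const rows = (result.rows ?? []) as Array<{ TABLE_NAME: string }>;
    for (const row of rows) {
      // Identifiers can't be bind variables, so they're interpolated; fine for a trusted admin script.
      await conn.execute(`GRANT SELECT ON ${owner}.${row.TABLE_NAME} TO ${grantee}`);
    }
  } finally {
    await conn.close();
  }
}

// e.g. grantSelectOnAllTables("HR", "REPORT_USER");
```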
Over the holidays I finally succeeded in putting together an instance of Stable Diffusion on my new home machine, and I’ve dabbled with it a bit.
Otherwise, and I know these are so common nowadays that we forget they’re AI:
- When writing emails for our team, I write in French and then use Google Translate to make a quick English copy for the developer who doesn’t understand French.
- In my car, I use Android Auto’s voice recognition to dictate text messages and to get directions.
- I use my phone’s voice recognition about once a week to dictate an SMS message rather than type it.
I don’t. There’s nothing I need to do or know that’s worth feeding the beast.
What do you mean by this?
This is a huge use for me. As you said, you can just ask “what’s that thing called that’s used for X and looks like a Y” or whatever and you’ll get an answer. It’s also much better than a thesaurus for finding synonyms since you can add extra context that narrows down the meaning.
I only use AI when I have some means of personally vetting the answer. Looking up phrases is great since often enough I already knew the answer, but even when I don’t I can look it up in a normal source.
Same goes for coding–it’s very useful for writing programs with a bunch of boilerplate, or where I’m not super familiar with the API. I can spot and fix bugs easily enough, so even when it gets things wrong I can just fix things on my own.
Sometimes I use it to check my own work. Early versions of ChatGPT were pretty bad for math and equations, but they’ve gotten much better, and it’s worthwhile comparing its answers to my own.
Moderators on this board have told me not to use AI.
What do you mean by “use AI?”
You mean just posting LLM output?
In my case I asked it to do math and it got it wrong (and/or I asked the question wrong).
I would never have AI write a post for me.
Hah! I knew there was a phrase for that, but I couldn’t think of “tip-of-the-tongue”… if only I asked the AI. How meta.
Yeah, that’s just how they work. They ingest a bunch of websites, etc., but only up to a certain point. The companies add new data every few months. ChatGPT will tell you its training cutoff date (currently October 2023) if you ask it. It can also fetch current webpages and analyze them for you, but only if you explicitly tell it to (wiki: retrieval-augmented generation). Otherwise, it will prefer using its existing internal training.
Coworker needed a way to resize a user’s browser screen by a percentage for her app. And her app only. This is mostly due to different screen resolutions.
Suspected there was a way to do it with JavaScript.
I did not want to spend time researching it because she was looking for a way to do it too.
ChatGPT spit out the code in about 6 seconds. Looks like it will work, but I did not test it.
I forwarded it on to her with a disclaimer that it was AI.
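In case anyone’s curious, the usual trick for this kind of thing (not the actual ChatGPT output, and just as untested) is to scale the app’s root element with a CSS transform based on how the window width compares to the width the layout was designed for. The element id and design width below are placeholders:

```typescript
// Scale the app's root element to fit the current window width.
// APP_ROOT_ID and DESIGN_WIDTH are placeholders for whatever her app actually uses.
const APP_ROOT_ID = "app";    // id of the app's top-level container
const DESIGN_WIDTH = 1920;    // pixel width the layout was designed for

function scaleAppToWindow(): void {
  const root = document.getElementById(APP_ROOT_ID);
  if (!root) {
    return;
  }
  // Percentage of the design width that's actually available, capped at 100%.
  const percent = Math.min(100, (window.innerWidth / DESIGN_WIDTH) * 100);

  // A CSS transform scales only this subtree, so the rest of the page is untouched.
  root.style.transformOrigin = "top left";
  root.style.transform = `scale(${percent / 100})`;
}

window.addEventListener("resize", scaleAppToWindow);
scaleAppToWindow();
```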
Based on what I’ve read here, if I was still a computer programmer I would definitely find a use for it. But since I’m retired and virtually never have to do anything creative, I don’t use it.
The one thing I would love to use it for is to help me with British cryptic crossword puzzles. But so far it has never given me anything of use.
Not much. Sometimes the AI text at the top of a Google search gives me the answer I need (such as how to format something in Word or Excel, that sort of thing) but I never look to use it for anything else. I’m not interested and for my job I don’t see it as useful. I’m also uninterested in playing with it for entertainment.
I don’t use voice commands with my phone, don’t have a Google Home or Alexa or whatever, no smart features in my home, and no plans for any of it. My car is new and connected with an app but I mostly use it to check charge level and verify if the doors are locked (my husband has a bad habit of not locking them!)
I don’t use it, but my wife uses AI regularly:
- She uses it for rephrasing/grammar-checking her emails (English is not her first language).
- She will ask it for answers to general knowledge questions.
- She will sometimes ask it for advice about life problems. If I gave her the same bland, generic advice she’d probably think it was irritatingly useless, but getting the same soothing, middle-of-the-road answers from a computer eases her mind.
For the most part, the only way I use AI is when I do a google search and there is an AI summary at the top of the page.
I use it extensively with teaching, and have found it’s quite the time saver for certain tasks.
When I’m teaching a new grammar structure, it gives a good summary in Japanese. I’m not a native Japanese speaker, so it’s useful.
Next, it’s great for making example sentences. I give it a pattern and ask for similar sentences. These need to be tweaked but it’s faster than creating my own.
It helps in creating homework and making quizzes.
With my online students, I often get better results from it than from tools such as Google Translate.
I haven’t tried to have it create lesson plans and such, but the assistance at this level is tremendous.
My wife uses it for various translations in her job.
If someone is looking at the output and not blindly believing it, AI is really helpful in many areas.
- Coding. It’s not perfect but I often use it as the starting point now. And if I get bogged down with some kind of library incompatibility, the kind of stuff that really wastes developers’ time, it’s good at suggesting ways forward.
- Googling stuff that takes a bit of explanation. Stuff that I might have even asked in GQ back in the day.
- The killer app for me of ChatGPT is as a thesaurus. It’s so much quicker than other methods.
The one thing I don’t use it for yet is the thing it’s arguably most famous for: writing letters, speeches, scripts etc. It still writes in its distinctive, sterile way IMO so every time I’ve asked it to write something for me I’ve ended up not using it.
My one-word answer is “rarely”. But I find the ability to ask a series of iterative questions to be extremely useful, keeping in mind the necessity of using other sources for verification.
One conversation a few months ago that I found quite endearing was about the use of a pizza stone in the oven. ChatGPT informed me, per the common wisdom, that it helped crisp the crust through the direct application of high heat. Then I asked it why, then, when I placed something on a cookie sheet on top of a pizza stone, it heated less well than when placed directly on an oven rack. It acknowledged that as an excellent question, and followed up with an interesting commentary on the thermodynamics of convection vs conduction. Then I solicited its opinion on the effects of parchment paper between a pizza and a pizza stone.
It had very much the feel of talking to a real person, and AFAIK all the information conveyed was reasonably accurate. The context memory and iterative questioning is what makes these AI engines so fundamentally different from dumb search engines.
I’m not AI, but happy to help with this if needed. (I’m a web dev.) Feel free to have her PM me if the AI code didn’t do the job.
Heh. More surprising than “programmers use AI” to me is that we have so many current or former programmers here on the SDMB. For some reason I thought we had a more diverse crowd. Maybe it’s just this particular topic drawing a self-selecting crowd of geeks.
I often find it (“it” as in LLM-based AI translation) to be more culturally/contextually sensitive than Google Translate or whatever. Presumably having large datasets in other languages lets it speak those languages more fluently than the older translation systems. It’s been quite incredible how well it can write. Even when it’s just English to English, changing the voice/tone of a piece, it is already a dramatically better writer than almost every human I know. Language manipulation is definitely one of its greatest strengths, even more so than programming. (Logic and reasoning and fact-checking, not so much… yet.)
Have you tried asking it to change its voice? Either by providing it a sample paragraph or two of the style you’d like, or asking it to “write this letter like ________ would”, either some famous person you know, or some writer, or something generic like “a teenager who grew up with TikTok”. Here’s an example of the Gettysburg Address as written by a teenager, Forrest Gump, Edgar Allan Poe, Trump, Toni Morrison, etc.… they’re all pretty different-sounding. IIRC, its distinctive voice is just a part of its default system prompts, not an inherent limitation of the model. You should be able to ask it to write in another style without too much trouble.
GPT does provide support for programming, like setting up the special function registers in a microprocessor. Anything that’s table lookup is a major time saver. But so far it is a bust at actual code. I imagine it is good at Python, but not suited to real-time microprocessor applications.
I find it very helpful in identifying books on very specific topics. Such as German economists and their use of IBM EDPM during WW2. Or the impact of profit and politics on the promotion of WW2 strategic bombing. So, it’s a quick chat and off to Alibris.
In idle moments I engage it in futile arguments just to see how it responds. Like 64 is a double square because 7X7 = 6X8 within the precision of the numbers used. It will buy the premise but never accept the conclusion.
I often find it useful during the day. When I ask it for help with LightBurn or LibreCAD, I quiz the result by asking “Does that program have the feature you just described?” The answer is sometimes “no”.
I haven’t asked it to try to mimic anyone’s style, but I have asked it to change its tone to be more humorous or persuasive, etc. It still reads to me like obviously ChatGPT, but YMMV.