this post was submitted on 28 Jun 2024
76 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


AI Work Assistants Need a Lot of Handholding

Getting full value out of AI workplace assistants is turning out to require a heavy lift from enterprises. ‘It has been more work than anticipated,’ says one CIO.

aka we are in the process of realizing we are paying for the privilege of beta-testing an incomplete product.

Mandell said if she asks a question related to 2024 data, the AI tool might deliver an answer based on 2023 data. At Cargill, an AI tool failed to correctly answer a straightforward question about who is on the company’s executive team, the agricultural giant said. At Eli Lilly, a tool gave incorrect answers to questions about expense policies, said Diogo Rau, the pharmaceutical firm’s chief information and digital officer.

I mean, imagine all the non-obvious stuff it must be getting wrong at the same time.

He said the company is regularly updating and refining its data to ensure accurate results from AI tools accessing it. That process includes the organization’s data engineers validating and cleaning up incoming data, and curating it into a “golden record,” with no contradictory or duplicate information.

Please stop feeding the thing too much information, you're making it confused.
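The "golden record" step the article describes is basically deduplication plus conflict detection. A minimal sketch of that idea, assuming a toy record layout (`key`, `field`, `value`, `updated`) that is entirely made up here, not Cargill's actual pipeline: collapse duplicates, prefer the newest value, and flag contradictions for a human.

```python
# Hypothetical sketch of "golden record" curation: merge duplicate rows,
# keep the most recently updated value, surface contradictions for review.
from collections import defaultdict

def build_golden_records(records):
    """records: list of dicts with 'key', 'field', 'value', 'updated'."""
    by_key = defaultdict(list)
    for r in records:
        by_key[(r["key"], r["field"])].append(r)

    golden, conflicts = {}, []
    for (key, field), rows in by_key.items():
        rows.sort(key=lambda r: r["updated"], reverse=True)  # newest first
        values = {r["value"] for r in rows}
        if len(values) > 1:
            conflicts.append((key, field, sorted(values)))  # contradictory data
        golden.setdefault(key, {})[field] = rows[0]["value"]
    return golden, conflicts
```

The unglamorous part, of course, is that real enterprises have millions of rows like this and no agreed-upon notion of which source wins.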

Some of the challenges with Copilot are related to the complicated art of prompting, Spataro said. Users might not understand how much context they actually need to give Copilot to get the right answer, he said, but he added that Copilot itself could also get better at asking for more context when it needs it.

Yeah, exactly like all the tech demos showed -- wait a minute!

[Google Cloud Chief Evangelist Richard Seroter said] “If you don’t have your data house in order, AI is going to be less valuable than it would be if it was,” he said. “You can’t just buy six units of AI and then magically change your business.”

Never mind that that's exactly how we've been marketing it.

Oh well, I guess you'll just have to wait for chatgpt-6.66, which will surely fix everything, while voiced by charlize theron's non-union equivalent.

top 35 comments
[–] [email protected] 36 points 4 months ago (1 children)

So, if you structure your data in such a way that using AI is completely unneeded, it's the perfect system to use AI on.

[–] [email protected] 20 points 4 months ago (1 children)

Wait, this is just a wiki with extra steps!

[–] [email protected] 14 points 4 months ago (1 children)
[–] [email protected] 12 points 4 months ago* (last edited 4 months ago)

I wonder why no business/economist types (*) just stand up and go 'this is all a scam'. It started with just needing a website (with some added idea that it might reduce advertisement costs and replace some secretaries who used to handle information requests). Fine, you can buy one of those, and maintenance and running costs are cheap. Then you need an ecommerce site, so you need a team of devs, admins, security people, but at least it brings in some revenue. But then every five years there is something else, and before long you are running a huge team of STEM people who are all working on the pivot to Apps/AI/Quantum-resistant whatever/NFTs/Cryptocurrencies/GDPR/continuous development/checking if none of the daily updates of all your huge dependencies break something/microservices/huge cloud bills/training of these tech people/moderation of your sites for pedophiles, racists, other extremists, and random satanic-scare-style worries. Bottom line, somebody must have discovered something like 'wait, why are we spending this much money on all this crap? We sell pet food for fucks sake. Ecommerce didn't reduce our costs, as now we need to hire and maintain delivery people and cars.'

*: I know why, because their jobs are also in on the scam.

[–] [email protected] 27 points 4 months ago (2 children)

a thought on this specifically:

Google Cloud Chief Evangelist Richard Seroter said he believes the desire to use tools like Gemini for Google Workspace is pushing organizations to do the type of data management work they might have been sluggish about in the past.

“If you don’t have your data house in order, AI is going to be less valuable than it would be if it was,” he said.

we're right back to "you're holding it wrong" again, i see

i'm definitely imagining Google re-whipping up their "Big Data" sales pitches in response to Gemini being borked or useless. "oh, see your problem is that you haven't modernized and empowered yourself by dumping all your databases into a (our) cloud native synergistic Data Sea, available for only $1.99/GB"

[–] [email protected] 21 points 4 months ago (5 children)

Google Cloud Chief Evangelist

That cannot be an official title someone has, can it?

[–] [email protected] 27 points 4 months ago* (last edited 4 months ago) (1 children)

It's a sad fate that sometimes befalls engineers who are good at talking to audiences, and who work for a big enough company that can afford to have that be their primary role.

edit: I love that he's chief evangelist though, like he has a bunch of little google cloud clerics running around doing chores for him.

[–] [email protected] 26 points 4 months ago (2 children)

I'm sorry but "evangelist" to me conjures an image of a techbro on a street corner with a makeshift shelf of Google Cloud brochures advertising free documentation study hours.

There's Jehovah's Witnesses doing the same schtick one block over and they're absolutely furious he took their favourite spot.

[–] [email protected] 9 points 4 months ago

Going door to door to share the news of the good lord acausalrobotgod.

Fake Edit: huh, just received an EA sponsorship deal in my email.

[–] [email protected] 5 points 4 months ago (1 children)

well they literally send these people to conferences to spread the word. as @Architeuthis said, it's very real - and it's also been for quite a while now

not sure when you got into the industry but you should've seen the kind of wacky shit that went down in the 10s (which, yes, is not that long ago at all). utterly fucking bizarre.

[–] [email protected] 5 points 4 months ago (1 children)

wot zirp bayfuckery hath wrought

[–] [email protected] 5 points 4 months ago

haha, I'm glad it brainwormed you :>

[–] [email protected] 21 points 4 months ago (2 children)

this is a completely standard silicon valley job title

[–] [email protected] 18 points 4 months ago (1 children)

Did no one tell them that "evangelist" is not exactly a positive term? Do they call their sales people "crusaders", too?

[–] [email protected] 13 points 4 months ago (1 children)

i think this is a common european misconception; the us is bloody religious, even compared to our polish parishes.

(and if you want to know how the salespeople think? remember the movie glengarry glen ross? the movie about alienation and lack of humanity? this absolutely inhumane monster is held up by them as an example of the right attitude.)

[–] [email protected] 11 points 4 months ago (3 children)

yeah apparently even the ideological armpit of poland is positively liberal compared to the bulk of the us

maps

[–] [email protected] 9 points 4 months ago (1 children)

Georgia is 70-82% absolute certain that god exists.

Which one? Both of them.

[–] [email protected] 6 points 4 months ago (1 children)
[–] [email protected] 8 points 4 months ago

The country and the state.

[–] [email protected] 9 points 4 months ago (2 children)

@skillissuer The younger generation in the US is secularizing rapidly, though—increased radicalization of the evangelicals (and association with white supremacism/neo-nazis) is driving an exodus from churches.

[–] [email protected] 9 points 4 months ago* (last edited 4 months ago)

depending on how you count, poland might be one of the fastest-secularizing countries in the world, and for pretty much the same reasons; we even had an abortion ban. it's not a competition of course

one extra factor is the backlash against the massive catholic propaganda campaign from the times when john paul 2 was pope, based on some twisted logic that because the pope is polish now, yall better be religious or else. nobody really bought this outside of pilgrimage-going weirdos (even if it's a consequence-free school trip, it's only for true believers because of the intense catholic radiation). so when the pope finally croaked, some of these nutjobs were lighting candles at the hour of his death (21:37) and singing his fave song for several days, as if that was completely normal and not something on par with North Koreans mourning the Dear Leader. over time it got mocked relentlessly, and in 2010s-era memes the pope will be forever remembered as a pedo war criminal (as he was). input 2137 into a search engine of your choice if you want to find out

also it'll be forever funny to me how american fundamentalist catholics try to move to "based catholic poland", a country that doesn't exist, in search of a tradwife and a slow life on a plot of land in the middle of nowhere. they missed a memo or several

[–] [email protected] 5 points 4 months ago

wait i just noticed that mastodon doesn't show images embedded in comments (there are maps)

[–] [email protected] 7 points 4 months ago

hey don't talk about podlasie that way.

[–] [email protected] 10 points 4 months ago* (last edited 4 months ago)

This is the moment I'll remember when future generations ask if there were any signs.

[–] [email protected] 16 points 4 months ago

What in the fuck

[–] [email protected] 10 points 4 months ago

yeah, that's the term for devrel executives.

[–] [email protected] 9 points 4 months ago

That's Reverend Doctor Seroter, thank you.

[–] [email protected] 13 points 4 months ago

Google pivoting to selling shovels for the AI gold rush in the form of data tools should be pretty viable if they commit to it, I hadn't thought if it that way.

[–] [email protected] 25 points 4 months ago (2 children)

“If you don’t have your data house in order, AI is going to be less valuable than it would be if it was,” he said.

If your data house is in order, why do you need AI assistants to find your neatly organized information for you anyways?

[–] [email protected] 22 points 4 months ago

Also, speaking from experience trying to do any database work for large corporate clients, no data house is in order. It's basically saying "assume a spherical cow, then AI works".

[–] [email protected] 13 points 4 months ago* (last edited 4 months ago)

To have a dead simple UI where you, a person with no technical expertise, can ask in plain language for the data you want, presented the way you want, along with some basic analysis you can tell it to make sound important. Then you tell it to turn that into an email in the style of your previous emails, send it, and take a 50min coffee break. All this allegedly with no overhead besides paying a subscription and telling your IT people to point the thing at the thing.

I mean, it would be quite something if transformers could do all that, instead of raising global temperatures to synthesize convincing looking but highly suspect messaging at best while being prone to delirium at worst.

[–] [email protected] 25 points 4 months ago* (last edited 4 months ago) (1 children)

Was wondering if they're using RAG, and they are, but in the worst possible way:

Complicating matters is the fact that Copilot doesn’t always know where to go to find an answer to a particular question, Spataro said. When asked a question about revenue, Copilot won’t necessarily know to go straight to the enterprise financial system of record rather than picking up any revenue-related numbers that appear in emails or documents, he said.

Thing might be rendered useful if you could constrain it to search a particular source or site. And even better, instead of hallucinating it could just give you a link and a citation. We could call it a search engine.
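For the record, "constrain it to a designated source and hand back a citation" is not exotic. A hedged toy sketch of the idea, where the document layout, sources, and the crude word-overlap scoring are all invented for illustration (nothing here reflects Copilot's actual retrieval):

```python
# Hypothetical sketch: restrict retrieval to an allowed system of record
# and return the matching passage verbatim, with a citation, instead of
# letting a model free-associate over emails.
def answer_with_citation(query, documents, allowed_sources):
    """documents: list of dicts with 'source', 'url', 'text'."""
    terms = set(query.lower().split())
    best, best_score = None, 0
    for doc in documents:
        if doc["source"] not in allowed_sources:
            continue  # e.g. skip emails; only trust the financial system of record
        score = len(terms & set(doc["text"].lower().split()))  # crude overlap
        if score > best_score:
            best, best_score = doc, score
    if best is None:
        return None
    return {"answer": best["text"], "citation": best["url"]}
```

Which is to say: the fix for "Copilot picks up revenue numbers from random emails" is a source filter and a link, i.e. search.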

[–] [email protected] 14 points 4 months ago

If you think of LLMs as being akin to lossy text compression of a set of text, where the compression artifacts happen to also result in grammatical-looking sentences, the question you eventually end up asking is "why is the compression lossy? What if we had the same thing but it returned text from its database without chewing it up first?" and then you realize that you've come full circle and reinvented search engines
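The full-circle point above can be made concrete with a toy: an index that returns stored text verbatim instead of a lossy reconstruction. This is a deliberately minimal sketch (a real search engine adds ranking, stemming, tokenization, and so on); it only shows the lossless-retrieval idea.

```python
# Toy lossless retrieval: the "database" hands back original text, unchewed.
from collections import defaultdict

class TinySearchEngine:
    def __init__(self):
        self.index = defaultdict(set)   # term -> doc ids containing it
        self.docs = {}                  # doc id -> original text, verbatim

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.index[term].add(doc_id)

    def search(self, query):
        ids = None
        for term in query.lower().split():
            ids = self.index[term] if ids is None else ids & self.index[term]
        return [self.docs[i] for i in sorted(ids or [])]
```

No compression artifacts, no hallucinated expense policies; the worst it can do is return nothing.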

[–] [email protected] 14 points 4 months ago* (last edited 4 months ago)

ChatGPT's reaction each morning when I tell it that it's now the year 2024 and Ilya no longer works at OAI

[–] [email protected] 10 points 4 months ago

Anyone who has seen tech hype like this before knows exactly what to expect.

This is why companies should pay for experience. They don’t, and we all get to go through the funhouse again.