Many workers are faking knowledge of AI to make sure they aren’t left behind: there’s a need for more AI training, report finds
Fake technologies, fake experts.
I have a theoretical degree in physics.
Fantastic, you’re hired!
I have a power plant for you to run; it’s near a casino, it will be great!
Seems to me that while companies are bullshitting by calling generic algorithms AI, it’s fine for the potentially employed to do the same.
I read an article the other day where an airline was bragging about using AI to predict how many passengers will buy a meal in flight, based on how many people had historically bought a meal in flight.
That’s… Literally just an average of how many people order a meal…
Ehhhhh there are much more sophisticated models than just an average. What a neural network could do is derive inferences based on a wide variety of inputs like time of day, country of origin, individual passenger characteristics, and so on.
Ultimately that application is just averaging over a smaller subset.
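To make the “averaging over a smaller subset” point concrete, here’s a minimal sketch (all flight numbers are invented for illustration): a global average gives one prediction for every flight, while grouping by features like route and departure hour just computes a conditional average within each subset.

```python
# Made-up data: (route, departure_hour, meals_sold) per flight.
flights = [
    ("JFK-LHR", 8, 120),
    ("JFK-LHR", 8, 130),
    ("JFK-LHR", 20, 60),
    ("SFO-NRT", 12, 200),
    ("SFO-NRT", 12, 190),
]

# Global average: one number applied to every flight.
global_avg = sum(m for _, _, m in flights) / len(flights)

# Conditional average: group by (route, hour), average within each group.
groups = {}
for route, hour, meals in flights:
    groups.setdefault((route, hour), []).append(meals)
cond_avg = {key: sum(v) / len(v) for key, v in groups.items()}

print(global_avg)                 # 140.0 for every flight
print(cond_avg[("JFK-LHR", 8)])   # 125.0 for that specific subset
```

Same arithmetic either way; the “model” just decides which subset you average over.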
Admittedly I don’t know that scenario myself, but it resembles several scenarios I’ve seen where people imagined AI would deliver some magic insight beyond plain statistics, and not one of those scenarios ever actually predicted better.
That’s not to say AI approaches are useless, but when the dataset is well organized and the required predictions are straightforward, a pretty simple statistical analysis is plenty. Declaring “AI” for such a simple scenario just undermines AI’s credibility in the areas where it really can do formerly infeasible things.
You can basically think of AI as a massively multivariate analysis that can go far beyond a directly applied model. So while yes, technically averages are involved, they’re applied in a way that makes it incredibly naive to call it “just averages”.
Edit: it is especially not “just an average of how many people order a meal” as you had said.
AI models are averages, except in the form of weights over a large set of matrices. However, calling them “just averaging” is grossly oversimplifying how they work.
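A toy sketch of what “weights over matrices” means (the feature names and weight values below are invented): a one-layer model’s output is a weighted sum of its inputs, i.e. a matrix-vector product, rather than an equal-weight average.

```python
def matvec(W, x):
    """Multiply matrix W (a list of rows) by vector x."""
    return [sum(w * xj for w, xj in zip(row, x)) for row in W]

# Hypothetical features: [hour_of_day / 24, is_international, historical_rate]
x = [8 / 24, 1.0, 0.45]

# Invented learned weights: 2 outputs, 3 inputs.
W = [
    [0.2, 0.5, 1.0],   # row for output 0
    [0.1, 0.3, 0.8],   # row for output 1
]

y = matvec(W, x)       # weighted sums, one per output row

# A plain average would weight every feature equally instead:
plain = sum(x) / len(x)
```

Real networks stack many such layers with nonlinearities in between, but the building block is exactly this kind of weighted combination.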
Most “AI” is just outcomes from machine learning.
And what are you? Some statistics wizard??
Many postings are like “10 years experience in ChatGPT” anyway so it balances out.
I am so glad I am no longer only doing software development. The interview process alone is hell. We need someone with a decade of experience in this one particular framework that only five companies on Earth use. If your tech stack is so far from the norm that you can’t find people who know it, consider changing it.
Or accept that you need to train people on the job.
The hiring process should focus on expertise in the language and the ability to program, not the framework. Knowledge of frameworks is just a nice bonus, and typically easy to pick up for an adept programmer.
That usually works out… but to be honest, some frameworks can be a big headache…
My management assigned me a title implying high level AI person. Evidently they had a mandate for X% AI experts, so a bunch of us had our titles arbitrarily changed and the mandate was satisfied.
No one can tell we aren’t, so I guess it worked out?
Yeah… Not exactly the same, but I have seen companies, when bidding for contracts, list the workers who actually hold the required certificates so they can win the contract. But those guys aren’t going to touch that work at all, since they’re on other projects; somebody else / another team will do it.
I guess it’s not that bad, since if they really need something they could eventually ask these other guys to help, but…
I mean, if I know what a quadratic function and a gradient descent are, does this mean I have knowledge of AI? Then most people can be taught that in an hour (if they are sufficiently dense).
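For reference, the whole “quadratic function plus gradient descent” lesson fits in a few lines. A minimal sketch: minimize f(x) = (x − 3)², whose gradient is f′(x) = 2(x − 3), by repeatedly stepping against the gradient.

```python
# Gradient descent on the quadratic f(x) = (x - 3)^2.
# Its gradient is f'(x) = 2 * (x - 3); the minimum is at x = 3.

def grad(x):
    return 2 * (x - 3)

x = 0.0    # starting guess
lr = 0.1   # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)   # step downhill

print(round(x, 6))  # -> 3.0, converged to the minimum
```

Training a neural network is this same loop, just over millions of weights at once.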
This is the way.
“Of course I’ve heard of AI. What a silly thing to ask! 😅”
Later, furiously looking up what AI even is
It’s what I do about every other technology at work. “Fake it until you make it”. Hopefully that day comes
Fake it till you make it 😂
Well, they shouldn’t be. AI, while often helpful in some use cases, is not really as powerful as some marketers want you to believe, and it very often gives useless or just plain false output (for example when it comes to chatbots).
If you believed everything they say, you’d come to the conclusion that we should all prepare for the singularity or some other shit coming this year or next, which is just bullshit.