Random AI question.
Will ChatGPT, OpenAI's GPT-3, and similar AIs eventually outsmart communities like EE and Stack Overflow? Do we need to make changes to the current model to compete with things like ChatGPT and Google's AI?
Thanks
I used to think computers might be able to make an exact copy of what Picasso created, but could never draw their own abstract art.
I used to think many things that would now, as of 01/26/2023, seem silly and idiotic to mention here.
So my personal thought is: "the Singularity is near."
I support the first comment. As someone knowledgeable in areas like Linux and scripting, I think there's only so much an AI can do there. When members ask questions, clarification is often needed, and that works best through back-and-forth between experts and askers. I don't see an AI asking you "what do you mean by ...?" to get a better description of the question. The AI will give you an answer, but has it understood what you really want, and does the answer match the requirement? I've seen questions about creating shell scripts that go back and forth, where members keep asking for details about the data structure being processed, edge conditions, and so on.
I believe EE, Stack and the others are safe for quite a while.
Long version:
Like Munib's article, I too remember the early '90s, when 4GL was being talked about as the next big thing that would make developers obsolete.
In 2023 we still have jobs for developers.
We tried 4GL and it wrote crap code.
I did a college project with LISP (https://en.wikipedia.org/wiki/Lisp_(programming_language)), since we had it on our college computer system, just to see what it could do. I wasn't overly impressed. Sure, you set up all the rules and it was pretty good at figuring things out, but it was still constrained by those rules. Outside-the-box thinking wasn't really allowed.
While programming in general is problem solving and AI is really good at problem solving, I don't believe AI will ever be able to write good code in my lifetime (but I'm also kinda old).
Do developers even write good code these days? Many don't, myself included at times. They write functional code, but most of the time it isn't as efficient as it could be. Computer power and capacity compensate for bad code, so most of the time no one cares.
So, who knows, maybe AI can write code like many developers right now.
When you first start coding in some new language, you write garbage code. Sure, it works but it is rarely considered 'good'.
After we've been using a particular language for a while, hopefully, we've ALL gone back and looked at code we wrote a year ago and thought: What dumba$$ wrote that!? I do that about daily!
I, for one, cannot wait for our computer overlords to take over, but I figure I'll have to continue showing up to work for a while....
Why am I commenting in this old thread now? Because I suspect Anir is experimenting in https://www.experts-exchange.com/dashboard/#/questions/contribute/29254838 (they left a clue in the code box). I repeated the experiment by pasting that thread's question into ChatGPT and got a very similar answer, with some paragraphs being identical.
I, for one, will be adding "why is a mouse when it spins?" as a paragraph to any question I ask at EE in future, because that completely stumps ChatGPT, and it's a lot easier than EE implementing an AI content detector.
As far as coding goes, when you know what you are looking for and have a good overall understanding of the problem, I find ChatGPT (I actually use paid Jasper) is much quicker than a search engine. For short bursts of code, the chat does a pretty good job of understanding what you want and providing some quick code or whatever you need.
The problem comes when you don't have a good grasp of something, such as a new topic; many people "don't know what they don't know," and that's when you can get in trouble. 90% of getting a good answer is being able to formulate a good question. More often than not, what we do here on EE is help identify what the question author is really asking and then base the answer on that. Many questions are asked with partial solutions in mind, and those partial solutions throw the asker off from the start. Asking that type of question of an AI is going to send you in the wrong direction.
We need to embrace AI. I would like to see AI-generated solutions off to the side, labeled "AI Generated," just to prevent other members from copy/pasting whatever they threw into an AI without really understanding what they are posting. That could potentially get things moving, or let live members comment on whether it is workable or not.
