Staying in Ethics and Legal with ChatGPT usage?
-
Oh sweet, ChatGPT built into Edge now!
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
Oh sweet, ChatGPT built into Edge now!
Literally on the phone talking about all the customers who have gotten infected by using Edge. It's the new attack vector. Most infections I've seen in a long time.
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
@flaxking said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
But not plagiarism
Right, it's never plagiarism unless, for example, you tell ChatGPT to output copyrighted content, like "tell me the exact content of XYZ, such as an existing book," if it can do that or close to it. Then using that content is the same as if you copied word for word from the book. In that case you are plagiarizing.
Right, exactly.
I asked ChatGPT about it directly, and it said this:
As an AI language model, I am a form of technology that generates text based on the input I receive from users. While I am not a human being, the text I generate is still considered someone else's work.
In the case of using my output as your own, it would be considered plagiarism if you do not properly attribute the text you generated through me. This is because the text I generate is based on a combination of machine learning algorithms and vast amounts of pre-existing text data that has been created by humans. Therefore, failing to properly cite the sources of the text you generate through me would be equivalent to not citing the sources of any other work created by human authors.
In summary, failing to attribute the text generated through an AI language model like myself can be considered plagiarism, even though the output comes from a machine rather than a person.
ME:
You said "can" be considered plagiarism. In what cases is it not considered plagiarism?
ChatGPT:
It's important to note that the issue of whether or not using text generated through an AI language model is considered plagiarism can be a complex one, and there is no clear-cut answer that applies to all situations. In general, however, it is not considered plagiarism to use text generated through an AI language model if:
-
You use the text only as a starting point or source of inspiration, and you substantially reword or rewrite the text in your own words.
-
You use the text as a secondary source, and you properly cite both the AI language model as the source of the text and any other sources that you drew upon to create your own work.
-
The text generated by the AI language model is in the public domain or has been released under a Creative Commons license that allows for reuse and modification.
In any case, it is always important to use your best judgment and consult with a teacher, professor, or other authority on academic integrity if you have any doubts about whether or not your use of text generated through an AI language model could be considered plagiarism.
ME:
Outside of academics is the same true?
ChatGPT :
Yes, the same principles of attribution and integrity apply outside of academics as well. While the consequences of plagiarism may not be as severe as in an academic or commercial context, it is still considered unethical and can damage your reputation and credibility in a given field.
If you are using text generated through an AI language model for non-academic purposes, it is still important to properly attribute the source of the text and obtain permission if necessary. Additionally, if you are using the text to create a product or service that you plan to sell, you should be particularly careful to ensure that you are not infringing on any copyrights or other intellectual property rights.
In summary, whether you are using text generated through an AI language model for academic or non-academic purposes, it is important to adhere to principles of attribution and integrity, and to always consult with appropriate authorities or legal experts if you have any doubts or questions.
Me:
Is it considered plagiarism outside of academics and commercial use?
ChatGPT :
Yes, using someone else's work or ideas without proper attribution or permission is still considered plagiarism even outside of academic or commercial use. Plagiarism is a violation of intellectual property rights and can be a legal issue in some cases.
For example, if you were to copy someone else's creative work, such as a piece of writing, artwork, or music, and present it as your own without permission or attribution, it would be considered plagiarism and could result in legal action being taken against you.
In general, it is important to always give credit to the original source of any work or ideas you use, whether in an academic or non-academic setting. This not only helps you avoid plagiarism but also shows respect for the work of others and can help build a stronger culture of collaboration and creativity.
--
That all said, I did confirm that although plagiarism itself is not a criminal or civil offense, it becomes illegal when it infringes an author's intellectual property rights, such as copyright or trademark.
Next ask it why it doesn't give attributions for the preexisting text it uses.
Isn't ChatGPT's output its own words? A concoction of information based on lots of data.
Right, you don't attribute the people you've learned from in normal circumstances.
-
-
@flaxking said in Staying in Ethics and Legal with ChatGPT usage?:
Next ask it why it doesn't give attributions for the preexisting text it uses.
Because it doesn't make sense to. Unless you ask for an academic-style output, you wouldn't expect that from any normal person.
-
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
As an AI language model, I am a form of technology that generates text based on the input I receive from users. While I am not a human being, the text I generate is still considered someone else's work.
You could say this about Google Search or Grammarly, but you don't attribute those.
Whose work is it considered to be?
I feel like this exposes the limits of AI that it doesn't understand the situation.
-
This is an important point in questioning, at a legal level, whether you could claim plagiarism, because ChatGPT's output cannot be copyrighted.
...
It’s unclear who can copyright or claim ownership of AI-generated works. The requester, who simply used a tool to generate text, or OpenAI? Who? For a work to enjoy copyright protection under current U.S. law, “the work must be the result of original and creative authorship by a human author,” says Margaret Esquenet, partner with Finnegan, Henderson, Farabow, Garrett & Dunner, LLP. “Absent human creative input, a work is not entitled to copyright protection. As a result, the U.S. Copyright Office will not register a work that was created by an autonomous artificial intelligence tool.”
...Plagiarism and copyright are not synonymous, so this is only suggestive. But the key is differentiating the output of another person from the output of a research and/or writing tool. ChatGPT is like a spell checker, a search engine, and Grammarly combined; it is a tool, and therefore its output raises neither plagiarism nor copyright concerns. Defining it as "AI" is confusing; you can call anything AI that adds some degree of processing. ChatGPT is the best one publicly available so far, by far, but that's all. It's not a new kind of tool, just way better than the old ones. We've already established that these kinds of tools are fine; there's no new discussion here that I'm aware of, only that the threshold of "a professor can reliably detect it" has been crossed.
-
There are good arguments against the following, but good arguments for it as well. From legal reviews:
A compelling argument, Kelber adds, “may be made that AI is simply a tool and that the human who is directing the AI should be able to claim ownership of the output. For example, a graphic artist can claim artwork made through the use of drawing software.”
-
Merriam-Webster:
#########
plagiarized; plagiarizing
plagiarize: to steal and pass off (the ideas or words of another) as one's own; to use (another's production) without crediting the source
##########
So if we look at how this wording matches US copyright law: the AI is determined under law not to be a person. In order to plagiarize, one must take from another person, not from a tool or machine. The copyright office has been super clear that humanity is a requirement for copyright ownership. Plagiarism, by the most common American dictionary, follows in kind. Seems black and white to me. Since the AI is the source of the text, unless it itself is plagiarizing (which it is supposed to be trained not to do), the concept of plagiarism cannot apply, as it carries the same burden of humanity as copyright does.
The logic ChatGPT gives for why you would need to attribute it does not hold up to the definition and would be considered irrelevant. Which, of course, is what we expect from a nascent AI: a large degree of error. And it is trained on a lot of questionable information.
So we can talk about appropriate or inappropriate uses of search engines or writing tooling, but it seems that copyright and plagiarism are clearly off the table. Simple definition precludes them from any discussion involving AI.
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
In any case, it is always important to use your best judgment and consult with a teacher, professor, or other authority on academic integrity if you have any doubts about whether or not your use of text generated through an AI language model could be considered plagiarism.
So basically it is saying you need to verify whether your "authority" figure is being honest or is just going to make up rules of their own and not abide by the English language. The question isn't whether the use is plagiarism, but whether a corrupt person will misuse the term for personal gain (e.g. professors looking for easy answers).
As someone who has reported teaching staff for academic dishonesty and been told that academic honesty is only for students, not for university staff, I have little allowance for dishonest educators.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
Oh sweet, ChatGPT built into Edge now!
Literally on the phone talking about all the customers who have gotten infected by using Edge. It's the new attack vector. Most infections I've seen in a long time.
That sounds more like the kind of situation where those people would have gotten infected just the same regardless of the web browser used. Were they on the latest version of the browser prior to infection?
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
That sounds more like the kind of situation where those people would have gotten infected just the same regardless of the web browser used. Were they on the latest version of the browser prior to infection?
As far as we can tell. It's on managed systems that are automatically updated; AV is up to date and active, firewall is on. But it just takes clicking on something.
We project that Edge puts people at additional risk because it is the default product on the most insecure platform, which is itself a default choice. That makes it very likely that your target is "accepting everything because it is the default" rather than being thoughtful in their technology choices, and it makes Edge an ideal public flag for identifying someone with a higher-than-average potential for malware.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
If your English professor wants you to write an essay, and you didn't write it, then I see a problem.
Problem, yes. But not plagiarism. And not quite cheating, either. It's a weird grey area. Because it's a universal tool.
So, no ownership? It is cheating, plain and simple. No different than looking over another's shoulder to copy what they are writing on a test.
Have we really gone that far that "grey" justifies virtually any kind of behaviour? No culpability? No ownership? No responsibility for one's actions?
Wow Scott. That's so sad and a total antithesis to what we've taught our kids in our home school.
That's like having a bot lift weights for me then going home and telling my wife that I did the required exercise for the day.
No way. That's just bunk.
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
It is cheating plain and simple. No different than looking over another's shoulder to pick up on what they are doing to write a test.
How do you come to that conclusion? Where does the cheating come from? From whom are you taking the content? No one. It's a tool.
By this logic, how do you allow spellcheckers, Grammarly, and other forms of "cheating" on the parts that don't matter?
I would say, by definition, if you consider this cheating, you can only do so by making the project about the busywork and not the output. Basically, that defines education as the avoidance of learning or value, rather than the increase of it.
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
So, no ownership?
Correct, in the same way that a typewriter, calculator, spell checker, or grammar assistant does not own the work it helps you to create. It still requires a human to operate those tools, and this one is no different.
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
Have we really gone that far that "grey" justifies virtually any kind of behaviour? No culpability? No ownership? No responsibility for one's actions?
Grey? What's grey? Using tools to write isn't just the basis for human improvement; the purpose of education is to teach humans to excel at the portions that we don't have tools for.
What you say doesn't make sense. Culpability for doing the right thing? Responsibly using available tools so that you can focus on educational value? What kind of culpability is that?
You are using "cheating" as a foregone conclusion. But I can't even find where there is a basis for the conversation. How has using a writing tool ever been cheating, and where else is it cheating? I don't see any component here that would qualify as a piece to see as grey. It's black and white: this is a tool, and there's no responsibility and no culpability because it's the RIGHT thing to do.
If you are writing papers and NOT using the available tools, aren't you just wasting time and admitting that the point is to waste time rather than to grow? Who is responsible for that? Who is culpable for that approach to "education?"
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
That's like having a bot lift weights for me then going home and telling my wife that I did the required exercise for the day.
No way. That's just bunk.
In your example, having a bot move weights does not accomplish the goal. In the use of ChatGPT, it does. So they are polar opposites.
If your goal was to "move weights around", basically to do busywork to waste your time, then yes, using a bot to do it would be the most responsible approach. Why would a human waste time doing something of no, or worse, negative value? Just to try to excuse the expenditure of time without needing to engage their brain.
That's what professors are doing. Everything you describe sounds like you are upset that we are exposing the education system. But you are blaming the students for exposing it rather than the professors and teachers who haven't been doing their jobs all along.
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
Have we really gone that far that "grey" justifies virtually any kind of behaviour?
Try it in reverse. Try to justify to me being a PhD student or an employee, having access to ChatGPT, and not using it. As an educator and employer, I see avoiding the use of the available tools as lazy and wrong. If you feel justified in excusing the grey area of doing manual work where none is needed and no value is added (in my estimation), then explain the opposite to me: how do you excuse not using the available tools and just filling the student's and/or employee's time with pointless busywork?
-
Just to be fair, I understand that you can justify acting unethically or lazily or in a "grey area" by saying you have incompetent or unethical professors, or that your job is worthless and the goal is to waste time. But assume ethical, competent educators, meaning they are there to grow your potential and teach what is valuable instead of using busywork to hide that they are not doing their job; or ethical managers at work who want the company to maximize profits, not keep unnecessary workers busy with pointless tasks to appear to need more headcount.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
So, no ownership?
Correct, in the same way that a typewriter, calculator, spell checker, or grammar assistant does not own the work it helps you to create. It still requires a human to operate those tools, and this one is no different.
In those examples the original content comes from the mind of the one hitting the keys.
Original, as in created, as in inspired by and written down, as in it came from the person themselves, not some machine.
Seriously Scott?
What's the saying? "Possession is 9/10ths of the law." ?
Having a machine spit out content then presenting it as something I created is a lie.
A shovel is a tool. A screwdriver is a tool. A computer is a tool.
Content is the creator's own.
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
In those examples the original content comes from the mind of the one hitting the keys.
To some degree, but not entirely. And the same is true of ChatGPT. You still need a competent operator to make it produce useful output. The average person can't operate it to get a good PhD thesis, for example. So it still comes from the mind of the operator of the tool.