Staying in Ethics and Legal with ChatGPT usage?
-
@Obsolesce Here is where it gets tricky in the homework or essay situation...
- Can you use tools like paper and pen? What about a typewriter? A word processor on a computer? A spell checker? A grammar processor like Grammarly (built into Zoho, for example)? What about something like Google Translate to help with language barriers?
- Can you use books as a reference? What about speeches? Or conversations? Conferences? Internet articles?
In 99.999% of cases, we establish that it is allowed to use extensive technological tools to improve our English writing. This is so understood to be how people write today that we don't even mention them; we expect that they will be used, and grading curves depend on you using those tools, or you are at a huge disadvantage ("how can someone spell THAT wrong, didn't they use a spell checker?").
In 99.999% of cases, we also establish that all the info on the Internet (and elsewhere) is fair game as reference material, with tons of it being generated by humans (real intelligence) or by automation (AI). So we are already using tools like ChatGPT, and in many cases better than it, to produce the research materials.
So the use of tooling like ChatGPT is so ubiquitous that its use is not so much a foregone conclusion as essentially a requirement (try getting a good grade without using those resources today; professors are trained to expect it, as every single student uses them, and that creates a baseline of quality). ChatGPT is essentially just an extremely advanced version of all of the above put together: a computerized tool for language checking AND for information searching, combined into one (Google is already this, too).
There's no single aspect of ChatGPT that isn't covered by how people do essays already; it's just vastly better at it.
If you look at it from the perspective of a kid in the 1980s, Google's web search is already so advanced, and does so much of the AI work for you, that 90% or more of the work that was considered a requirement for writing essays back then is automated today. And Grammarly replaces so much of the English-training knowledge.
I think you'd find, from a historic perspective, that this isn't a huge game-changing moment, but just a point along a continuum.
And one can easily argue that a student not using the industry-standard tools for research and writing is being lazy and foolish, because the point of school is to train you, and if you are doing work that will never be required in the real world, that's not teaching you useful skills. The bigger skill today is learning how to use tools like ChatGPT effectively and how to check their output to see if it is good.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
But not plagiarism
Right, it's never plagiarism unless, for example, you tell ChatGPT to output copyrighted content ("give me the exact content of XYZ," such as an existing book), assuming it can do that or come close. If you then use that content the same as if you had copied it word for word from the book, then you are plagiarizing.
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
But not plagiarism
Right, it's never plagiarism unless, for example, you tell ChatGPT to output copyrighted content ("give me the exact content of XYZ," such as an existing book), assuming it can do that or come close. If you then use that content the same as if you had copied it word for word from the book, then you are plagiarizing.
Right, exactly.
-
This would be a good lesson in laws and ethics. Sadly, the laws have not caught up to this new technology. So at this point it's pretty much anybody's guess how the law is going to proceed when it comes to copyright and plagiarism.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
But not plagiarism
Right, it's never plagiarism unless, for example, you tell ChatGPT to output copyrighted content ("give me the exact content of XYZ," such as an existing book), assuming it can do that or come close. If you then use that content the same as if you had copied it word for word from the book, then you are plagiarizing.
Right, exactly.
I asked ChatGPT about it directly, and it said this:
As an AI language model, I am a form of technology that generates text based on the input I receive from users. While I am not a human being, the text I generate is still considered someone else's work.
In the case of using my output as your own, it would be considered plagiarism if you do not properly attribute the text you generated through me. This is because the text I generate is based on a combination of machine learning algorithms and vast amounts of pre-existing text data that has been created by humans. Therefore, failing to properly cite the sources of the text you generate through me would be equivalent to not citing the sources of any other work created by human authors.
In summary, failing to attribute the text generated through an AI language model like myself can be considered plagiarism, even though the output comes from a machine rather than a person.
ME:
You said "can" be considered plagiarism. In what cases is it not considered plagiarism?
ChatGPT:
It's important to note that the issue of whether or not using text generated through an AI language model is considered plagiarism can be a complex one, and there is no clear-cut answer that applies to all situations. In general, however, it is not considered plagiarism to use text generated through an AI language model if:
-
You use the text only as a starting point or source of inspiration, and you substantially reword or rewrite the text in your own words.
-
You use the text as a secondary source, and you properly cite both the AI language model as the source of the text and any other sources that you drew upon to create your own work.
-
The text generated by the AI language model is in the public domain or has been released under a Creative Commons license that allows for reuse and modification.
In any case, it is always important to use your best judgment and consult with a teacher, professor, or other authority on academic integrity if you have any doubts about whether or not your use of text generated through an AI language model could be considered plagiarism.
ME:
Outside of academics is the same true?
ChatGPT:
Yes, the same principles of attribution and integrity apply outside of academics as well. While the consequences of plagiarism may not be as severe as in an academic or commercial context, it is still considered unethical and can damage your reputation and credibility in a given field.
If you are using text generated through an AI language model for non-academic purposes, it is still important to properly attribute the source of the text and obtain permission if necessary. Additionally, if you are using the text to create a product or service that you plan to sell, you should be particularly careful to ensure that you are not infringing on any copyrights or other intellectual property rights.
In summary, whether you are using text generated through an AI language model for academic or non-academic purposes, it is important to adhere to principles of attribution and integrity, and to always consult with appropriate authorities or legal experts if you have any doubts or questions.
Me:
Is it considered plagiarism outside of academics and commercial use?
ChatGPT:
Yes, using someone else's work or ideas without proper attribution or permission is still considered plagiarism even outside of academic or commercial use. Plagiarism is a violation of intellectual property rights and can be a legal issue in some cases.
For example, if you were to copy someone else's creative work, such as a piece of writing, artwork, or music, and present it as your own without permission or attribution, it would be considered plagiarism and could result in legal action being taken against you.
In general, it is important to always give credit to the original source of any work or ideas you use, whether in an academic or non-academic setting. This not only helps you avoid plagiarism but also shows respect for the work of others and can help build a stronger culture of collaboration and creativity.
--
That all said, I did confirm that although plagiarism itself is not a criminal or civil offense, it can be illegal when it infringes an author's intellectual property rights, such as copyright or trademark.
-
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
That all said, I did confirm that although plagiarism itself is not a criminal or civil offense, it can be illegal when it infringes an author's intellectual property rights, such as copyright or trademark.
Next ask it why it doesn't give attributions for the preexisting text it uses.
-
-
@flaxking said in Staying in Ethics and Legal with ChatGPT usage?:
Next ask it why it doesn't give attributions for the preexisting text it uses.
Isn't ChatGPT's output its own words? A concoction of info based on lots of data.
-
-
Oh sweet, ChatGPT built into Edge now!
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
Oh sweet, ChatGPT built into Edge now!
Literally on the phone right now talking about all the customers who have gotten infected by using Edge. It's the new attack vector; the most infections I've seen in a long time.
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
@flaxking said in Staying in Ethics and Legal with ChatGPT usage?:
Next ask it why it doesn't give attributions for the preexisting text it uses.
Isn't ChatGPT's output its own words? A concoction of info based on lots of data.
Right, you don't attribute the people you've learned from in normal circumstances.
-
-
@flaxking said in Staying in Ethics and Legal with ChatGPT usage?:
Next ask it why it doesn't give attributions for the preexisting text it uses.
Because it doesn't make sense to. Unless you ask for academic-style output, you wouldn't expect that from any normal person.
-
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
As an AI language model, I am a form of technology that generates text based on the input I receive from users. While I am not a human being, the text I generate is still considered someone else's work.
You could say this about Google Search or Grammarly, but you don't attribute those.
Whose work is it considered to be?
I feel like this exposes the limits of the AI: it doesn't understand the situation.
-
This is an important point in questioning, at a legal level, whether you could claim plagiarism at all, because ChatGPT's output cannot be copyrighted.
...
It’s unclear who can copyright or claim ownership of AI-generated works. The requester, who simply used a tool to generate text, or OpenAI? Who? For a work to enjoy copyright protection under current U.S. law, “the work must be the result of original and creative authorship by a human author,” says Margaret Esquenet, partner with Finnegan, Henderson, Farabow, Garrett & Dunner, LLP. “Absent human creative input, a work is not entitled to copyright protection. As a result, the U.S. Copyright Office will not register a work that was created by an autonomous artificial intelligence tool.”
...Plagiarism and copyright are not synonymous, so this is only suggestive. But the key is differentiating the output of another person from the output of a research and/or writing tool. ChatGPT is like a spell checker, a search engine, and Grammarly combined; it is a tool, and therefore not subject to plagiarism OR copyright concerns. Defining it as "AI" is confusing: you can call anything "AI" that adds some degree of processing. ChatGPT is the best one publicly available so far, by far, but that's all. It's not a new kind of tool, just way better than the old ones. We've already established that these kinds of tools are acceptable; there's no new discussion here that I'm aware of... only that the line of "a professor can reliably detect it" has been crossed.
-
There are good arguments against the following, but good arguments for it as well. From legal reviews:
A compelling argument, Kelber adds, “may be made that AI is simply a tool and that the human who is directing the AI should be able to claim ownership of the output. For example, a graphic artist can claim artwork made through the use of drawing software.”
-
Merriam-Webster:
#########
plagiarized; plagiarizing
- to steal and pass off (the ideas or words of another) as one's own
- use (another's production) without crediting the source
#########
So if we look at how this wording matches US copyright law... AI is determined under the law not to be a person. In order to plagiarize, one must take from another person, not from a tool or machine. The Copyright Office has been super clear that humanity is a requirement for copyright ownership. Plagiarism, by the most common American dictionary, follows in kind. Seems black and white to me. Since the AI is the source of the text, unless it itself is plagiarizing (which it is supposed to be trained not to do), the concept of plagiarism cannot apply, as it carries the same burden of humanity as copyright does.
The logic that ChatGPT gives for why you would need to attribute it does not hold up to the definition and would be considered irrelevant. Which, of course, is to be expected: a nascent AI will make a large number of errors, and it is trained on a lot of questionable information.
So we can talk about appropriate or inappropriate uses of search engines or writing tooling, but copyright and plagiarism are clearly off the table. Simple definition precludes them from any discussion involving AI.
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
In any case, it is always important to use your best judgment and consult with a teacher, professor, or other authority on academic integrity if you have any doubts about whether or not your use of text generated through an AI language model could be considered plagiarism.
So basically it is saying you need to verify whether your "authority" figure is being honest, or is just going to make up rules of their own and not abide by the English language. The question isn't whether the use is plagiarism, but whether a corrupt person will misuse the term for personal gain (e.g. professors looking for easy answers.)
As someone who has reported teaching staff for academic dishonesty and been told that academic honesty is only for students, not for university staff, I have little tolerance for dishonest educators.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
Oh sweet, ChatGPT built into Edge now!
Literally on the phone talking about all the customers who have gotten infected by using Edge. It's the new attack vector. Most infections I've seen in a long time.
That sounds more like the kind of situation where those people would have gotten infected just the same, regardless of the web browser used. Were they on the latest version of the browser prior to infection?
-
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
That sounds more like the kind of situation where those people would have gotten infected just the same, regardless of the web browser used. Were they on the latest version of the browser prior to infection?
As far as we can tell. It's on managed systems that are automatically updated; AV is up to date and active, and the firewall is on. But it just takes clicking on something.
We project that Edge puts people at additional risk because it is the default product on the most insecure platform, which is itself a default choice. That makes it highly likely that your target is someone "accepting everything because it is the default" rather than being thoughtful in their technology choices. It makes Edge an ideal public "flag" marking someone as a higher-than-average potential malware target.
-
@scottalanmiller said in Staying in Ethics and Legal with ChatGPT usage?:
@Obsolesce said in Staying in Ethics and Legal with ChatGPT usage?:
If your English professor wants you to write an essay, and you didn't write it, then I see a problem.
Problem, yes. But not plagiarism. And not quite cheating, either. It's a weird grey area. Because it's a universal tool.
So, no ownership? It is cheating, plain and simple. No different than looking over another's shoulder during a test to pick up what they are doing.
Have we really gone so far that "grey" justifies virtually any kind of behaviour? No culpability? No ownership? No responsibility for one's actions?
Wow Scott. That's so sad and a total antithesis to what we've taught our kids in our home school.
That's like having a bot lift weights for me then going home and telling my wife that I did the required exercise for the day.
No way. That's just bunk.
-
@PhlipElder said in Staying in Ethics and Legal with ChatGPT usage?:
It is cheating, plain and simple. No different than looking over another's shoulder during a test to pick up what they are doing.
How do you come to that conclusion? Where does the cheating come from? From whom are you taking the content? No one. It's a tool.
By that logic, how do you allow spell checkers, Grammarly, and other forms of "cheating" on the parts that don't matter?
I would say, by definition, if you consider this cheating, you can only do so by making the project about the busy work and not the output. That basically defines education as the avoidance of learning or value, rather than the increase of it.