
Conversation

@musabgultekin (Contributor) commented Mar 14, 2023

I'm not an expert on licenses, but if you attribute Facebook in the README and description, you essentially admit/imply that this repo is a modification of their repo. Facebook's repo has a GPL-3.0 license, which would mean this repo should also be GPL-3.0 in that case, which is something we don't want.

This PR fixes that potential language issue.

I'd also recommend changing the description, for example to
"Port of LLaMA model in C/C++"

@sowa705 commented Mar 14, 2023

This is a complete reimplementation of the GPL code. I don't think GPL applies in this case.

@G2G2G2G commented Mar 14, 2023

https://www.gnu.org/licenses/old-licenses/gpl-2.0.html

> The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language.

@sowa705 Are you sure? What license does Facebook use?

Or is this changed enough not to be considered a translation?

@sowa705 commented Mar 14, 2023

There is no derivative work here. It's a completely different program that happens to implement the same algorithm (in a different way).

@musabgultekin (Contributor, Author)

@sowa705 That's exactly what I wanted to clarify with this PR. I suggest changing the language that could imply this repo is a derivative work.

@@ -3,7 +3,7 @@
[![Actions Status](https://github.com/ggerganov/llama.cpp/workflows/CI/badge.svg)](https://github.com/ggerganov/llama.cpp/actions)
[![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](https://opensource.org/licenses/MIT)

-Inference of [Facebook's LLaMA](https://github.com/facebookresearch/llama) model in pure C/C++
+Inference of [LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++

Member

Suggested change:
-Inference of [LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++
+Inference of [Facebook's LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++

@ggerganov merged commit 977295c into ggml-org:master on Mar 15, 2023
@musabgultekin deleted the patch-1 branch on March 15, 2023 20:27
blackhole89 pushed a commit that referenced this pull request Mar 15, 2023
* Update README.md

* Update README.md

remove facebook
bitRAKE pushed a commit to bitRAKE/llama.cpp that referenced this pull request Mar 17, 2023
* Update README.md

* Update README.md

remove facebook
@francis2tm (Contributor)
What about using the model? How can you be sure it's not against their terms? @musabgultekin @sowa705

@musabgultekin (Contributor, Author)
I have no idea about that.

rooprob pushed a commit to rooprob/llama.cpp that referenced this pull request Aug 2, 2023
Center align cute llama image in README
Deadsg pushed a commit to Deadsg/llama.cpp that referenced this pull request Dec 19, 2023