Last Updated on March 12, 2024

Everything has its limits. Some astronomers, philosophers, or theologians will certainly disagree with me, but in the IT world this statement is perfectly true. What's more, storage limits are one of the big problems of this industry. So what does it look like in the most popular code hosting service? What is the GitHub storage limit? What is the GitHub max file size? Let me use my favorite answer – it depends.

GitHub limits

By default, if a file is larger than 50 MB, you'll get a warning that you may be doing something wrong, but it will still be possible to push the file. The hard block only kicks in at 100 MB – this is the GitHub file size limit. If you are uploading via the browser, the limit is even lower – a file can be no larger than 25 MB.
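If you want to check up front whether anything in your working copy would trip these thresholds, a minimal sketch is enough – assuming a Unix-like shell with GNU or BSD find:

```bash
# Files in the working tree larger than GitHub's 100 MB hard limit
find . -type f -size +100M -not -path "./.git/*"

# Files that would already trigger the 50 MB warning
find . -type f -size +50M -not -path "./.git/*"
```

Anything the first command prints will be rejected on push; anything from the second will at least earn you a warning.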

Of course, these are the default settings, but you can extend these limits and add larger files to the repo. For this purpose, Git LFS (Large File Storage) was created. It is a feature that allows us to put much larger files into the repository, and this time the limits depend on our account type.

(Table in the original post: maximum file size with Git LFS, by account type. Source: GitHub)

Let’s leave the sizes of individual files for a moment and check the limits related to the entire repository. How big is the GitHub repository size limit? Well, there is no unequivocal answer or hard threshold here. The recommended, optimal repository size is less than 1 GB, while staying below 5 GB is “strongly recommended”. This is not a hard limit, and theoretically we can exceed it – but then we can expect contact from support to clarify whether we really need that much space.
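If you are wondering where your own repository stands against these thresholds, you can check it locally; a minimal sketch, assuming a Unix-like shell and Git installed:

```bash
# Sizes of all objects Git stores for this repository (loose and packed)
git count-objects -vH

# Rough on-disk footprint of the whole .git directory
du -sh .git
```

The size-pack value reported by count-objects is roughly the size of your packed history, which is the number that matters most here.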

Let us stop here for a moment. If you are reading carefully, an alarm bell should be ringing by now. Why does GitHub recommend keeping the entire repository under 1 GB, while the aforementioned LFS allows a GitHub Free account to upload a single 2 GB file? Sit back, everything will become clear in a moment.


GitHub Large File Storage

We already know the individual limits, but how does this LFS mechanism actually work? Well, there is a little ‘cheat’ here, because these big files are not really stored in our repository! Let me quote the official documentation:

“Git LFS handles large files by storing references to the file in the repository, but not the actual file itself. To work around Git’s architecture, Git LFS creates a pointer file which acts as a reference to the actual file (which is stored somewhere else). GitHub manages this pointer file in your repository. When you clone the repository down, GitHub uses the pointer file as a map to go and find the large file for you.”
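To make this mechanism less abstract, here is a minimal sketch of how a file typically ends up under LFS management; the `*.psd` pattern and the design.psd file name are just placeholder examples:

```bash
# One-time setup: install the LFS hooks for your user account
git lfs install

# Tell Git LFS which files to manage (the pattern is stored in .gitattributes)
git lfs track "*.psd"
git add .gitattributes

# From now on, matching files are committed as small pointer files,
# while the real content is uploaded to LFS storage on push
git add design.psd
git commit -m "Add design file via Git LFS"
git push origin main
```

What lands in the Git history is only the tiny pointer file; the actual content goes to LFS storage during the push, exactly as the documentation above describes.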

Another tricky part of LFS is that, whether or not you have a paid subscription, there are quotas by default: 1 GB of free storage and 1 GB of free bandwidth per month. What does that mean? Every push of a big file consumes the storage quota – push a 500 MB file twice and you have used up all of the free storage. Bandwidth is consumed whenever a user (or a GitHub Actions workflow) downloads the file, and since the quota is charged on every download, it can run out pretty quickly. Of course, it is possible to purchase larger quotas.
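If you want to keep an eye on what may be eating into those quotas, you can at least inspect what LFS is handling on your side; a short sketch:

```bash
# Patterns currently routed to LFS (read from .gitattributes)
git lfs track

# Files in the current checkout whose content lives in LFS storage
git lfs ls-files
```

Note that the actual storage and bandwidth counters are visible in your GitHub account’s billing settings; these commands only show what is routed to LFS locally.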

Why is my repository so big?

The best answer to the question “how to upload large files to GitHub” is – do not upload large files to GitHub. Simple as that. A common problem is treating the repository as a bottomless bag into which we can put everything. More than once I have encountered log files, compiled classes, or other equally unnecessary binary files added to the repo. Sometimes you can also find entire external libraries committed there, which is pointless – and sometimes maybe even illegal! Be aware of that. There is no clear-cut solution here, because the problem depends on the specific situation in a given repo, but the official documentation may be helpful, and the sketch below shows one way to find the biggest offenders.
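One common way to find out what is bloating a repository is to list the largest blobs in its entire history; a sketch assuming a Unix-like shell:

```bash
# List every object reachable from any branch, annotate it with its type,
# size and path, then show the ten largest blobs
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | awk '$1 == "blob"' \
  | sort -k3 -n \
  | tail -n 10
```

Once you know which files should never have been committed, add matching patterns to .gitignore so they do not come back – and only then think about rewriting history.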


Ready to overcome GitHub file size limits? Do the next best thing and secure your code with the first professional GitHub backup.


The situation is different when it comes to graphic files. They are often necessary assets that simply have to be in the repository. But here, too, we can optimize by applying compression. And we don’t have to do it ourselves, as there are ready-made apps in the GitHub Marketplace that will do it for us. For example, I recommend Imgbot, which compresses image files without any loss of quality.

Large files vs. backup solutions

That should be our mantra – make a backup before removing anything from the remote repository. Make a backup before making any changes to the repo configuration. And repeat it over and over again. But here, too, the size of the repository matters a great deal. A “light” repo allows us to operate faster and more efficiently. It also makes backup and recovery after a failure easier. A smaller repo is beneficial in every situation.

Also, before we start cleaning up our repo or changing its configuration, it would be ideal to first make a proper backup. For example, GitProtect.io allows you to back up not only GitHub repositories and metadata, but also GitHub LFS, keeping all your critical assets secure. What’s more, during the restore process you can easily enable or disable LFS restore. In this case, if a disaster takes place, you can decide whether you need to restore your LFS files immediately for workflow continuity, or whether you can restore them later and give priority to other metadata so your team can keep coding.

Before you go:

✍️ Subscribe to GitProtect DevSecOps X-Ray Newsletter and stay informed about the latest developments in DevSecOps

🔎 Find out what security duties you have within the GitHub Shared Responsibility Model

📚 Learn about the importance of data retention policies in DevOps backup and recovery

🔎 Discover which GitHub security best practices to follow to ensure that your repositories and metadata are safe

📅 Schedule a live custom demo and learn more about GitProtect backups for your GitHub repositories and metadata

📌 Or try GitProtect backups for the GitHub environment and start protecting your critical GitHub data in a click
