The “Argument list too long” Error in Linux Commands
Overview
In this article, we will discuss the “Argument list too long” error that appears when you try to execute certain commands on a Linux system. We'll explain what causes the error and how to resolve it.
What Is The Argument List Too Long Error?
This error does not mean an argument is invalid. It occurs when the combined size of the command-line arguments (plus the environment) exceeds the limit the kernel enforces when executing a program. For example, if the wildcard below matched a huge number of files, the following command would fail −
ls -l /usr/bin/* | grep binutils
You may get the following error −
-bash: /usr/bin/ls: Argument list too long
If you want to know why the above command fails, then read on.
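The failure actually comes from the kernel, not from the tool itself: when the shell calls exec() with an oversized argument list, the kernel rejects it with E2BIG, and Bash reports “Argument list too long”. A minimal sketch that reproduces this; the argument count is an arbitrary value chosen to exceed the common 2 MiB limit:

```shell
# Sketch: force exec() to fail by passing ~2 million short arguments
# (roughly 14 MB in total) to an external command. The count is an
# assumption chosen to exceed typical ARG_MAX values.
/bin/echo $(seq 1 2000000) > /dev/null
echo "exit status: $?"   # non-zero: the shell could not exec /bin/echo
```

Note that the shell's built-in echo would not fail here, because no new program is executed; the error only appears when an external binary such as /bin/echo must be exec'd.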
What Causes the Error?
We'll look at a situation where there are a lot of files inside a directory.
$ ls -lrt | wc -l
230086
$ ls -lrt | tail -5
-rw-r--r-- 1 shubh shubh 0 Nov 30 14:02 events2120038.log
-rw-r--r-- 1 shubh shubh 0 Nov 30 14:02 events2120040.log
-rw-r--r-- 1 shubh shubh 0 Nov 30 14:02 events2120039.log
-rw-r--r-- 1 shubh shubh 0 Nov 30 14:02 events2120042.log
-rw-r--r-- 1 shubh shubh 0 Nov 30 14:02 events2120041.log
We have over 230K log file names in our folder. Let’s see if we can get the number of files that begin with the word events.
$ ls -lrt events* | wc -l
-bash: /usr/bin/ls: Argument list too long
0
Notably, the command fails, citing “Argument list too long” as the reason. Let’s try the rm command to get rid of these files −
$ rm -rf events*.log
-bash: /usr/bin/rm: Argument list too long
The command again failed because of the same reason.
When Bash performs filename expansion (globbing), it replaces the * pattern with every matching file name. With over 230K matches, this produces an enormous argument list for the command.
Bash itself can build that list, but the kernel refuses to pass it to a new program if it is too large. Note that the buffer holding the arguments is shared with the environment variables, so the space actually available for arguments is smaller than the limit.
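Because arguments and environment share the same buffer, the headroom left for arguments is roughly ARG_MAX minus the size of the current environment. A rough sketch of that arithmetic (the byte counts will differ on every system):

```shell
# Sketch: estimate how many argument bytes remain once the environment
# is accounted for. All values vary from system to system.
limit=$(getconf ARG_MAX)     # kernel limit on args + environment
env_bytes=$(env | wc -c)     # approximate size of the environment
echo "ARG_MAX:              $limit bytes"
echo "Environment occupies: $env_bytes bytes"
echo "Roughly available:    $((limit - env_bytes)) bytes for arguments"
```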
The rm command in the previous example expands to −
$ rm -rf events2120038.log events2120040.log ... events0000001.log
Here, the expanded argument vector is far larger than the kernel allows. The upper limit, ARG_MAX, applies to the combined size of the argument list and the environment, and we can query it with getconf −
$ getconf ARG_MAX
2097152
ARG_MAX defines the maximum space available for the arguments (and environment) passed to the exec() family of functions. You can inspect the effective limits using the xargs command.
$ xargs --show-limits
Your environment variables take up 2504 bytes
POSIX upper limit on argument length (this system): 2092600
POSIX smallest allowable upper limit on argument length (all systems): 4096
Maximum length of command we could actually use: 2090096
Size of command buffer we are actually using: 131072
Maximum parallelism (--max-procs must be no greater): 2147483647
The most important thing to know about these upper limits on argument length is that they may differ from one operating system to another.
Overcoming the Limitation
We’ll look at different ways we can approach solving this problem. All of them avoid expanding the entire file list into a single argument list.
Using the find Command
You can iterate through the file names using the find command and process them in batches using either the -exec option or xargs.
$ find . -iname "events*" | xargs ls -lrt | wc -l
230085
We first get the file names starting with "events" using the find command. Next, we pass the output of find to the xargs command, which invokes ls with batches of file names that fit within the limit. Finally, we count the resulting lines with wc.
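find can also batch the arguments itself, without xargs, via -exec ... {} +, which appends as many file names to each invocation as the limit allows. A self-contained sketch using a temporary directory with a few sample files (the directory and names are illustrative only):

```shell
# Sketch: -exec ... {} + groups file names into appropriately sized
# batches, much like xargs. Sample files stand in for the 230K logs.
dir=$(mktemp -d)
touch "$dir/events1.log" "$dir/events2.log" "$dir/other.log"
find "$dir" -iname "events*" -exec ls {} + | wc -l    # prints 2
rm -rf "$dir"
```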
Using the for Loop Approach
Another way to iterate through the files is with a for loop.
$ for f in events*; do echo "$f"; done | wc -l
230085
This is one of the simplest solutions to the problem. The loop runs inside the shell, so no single command ever receives the full list as arguments. It may be slower than the alternatives, but it works reliably.
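The same loop pattern works for deletion: rm is invoked once per file, so no argument list ever grows beyond a single name. A self-contained sketch, with a temporary directory standing in for the log folder above:

```shell
# Sketch: delete matching files one at a time. Slow for 230K files,
# but immune to the ARG_MAX limit. Sample files are illustrative.
dir=$(mktemp -d)
touch "$dir/events1.log" "$dir/events2.log"
for f in "$dir"/events*; do
    rm -- "$f"           # '--' guards against names starting with '-'
done
ls "$dir" | wc -l        # prints 0
rm -rf "$dir"
```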
Manual Split
We can split the set of files into smaller chunks and run the command repeatedly, passing a different subset of names as arguments each time.
$ ls -lrt events1*.log | wc -l
31154
$ ls -lrt events2*.log | wc -l
15941
Here we filter only the files whose names start with "events1", and then with "events2". Each of these narrower globs expands to far fewer names and stays within the ARG_MAX limit.
We then repeat the same step for the remaining prefixes ("events0", "events3", and so on).
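The manual repetition can itself be scripted by looping over the prefixes; each narrower glob then expands to a list small enough for one rm invocation. A sketch, assuming the numeric naming scheme shown above:

```shell
# Sketch: split the glob by the digit that follows "events". Each
# sub-glob expands to far fewer names, keeping every rm call under
# the limit. -f silences errors for prefixes with no matches.
for i in 0 1 2 3 4 5 6 7 8 9; do
    rm -f events"$i"*.log
done
```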
When We Just Need to Remove the Content of a Directory
Sometimes the goal is simply to empty a directory. Consider a scenario where the obvious approach fails −
$ rm -rf *
-bash: /usr/bin/rm: Argument list too long
If we want to solve this problem, we could instead just remove the directory and recreate it.
$ rm -rf /home/shubh/tempdir/logs_archive
$ cd /home/shubh/tempdir && mkdir logs_archive
Here, the logs_archive folder contains the files we want to delete.
Since we're deleting the folder and recreating it, this approach won't keep the original permission settings or ownership of the folder.
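If the permissions and ownership matter, one workaround (a sketch using GNU stat; the option flags differ on BSD/macOS) is to record them before deleting and restore them on the recreated directory:

```shell
# Sketch: preserve mode and ownership across delete-and-recreate.
# GNU stat flags assumed; the temporary directory is illustrative
# and stands in for logs_archive.
dir=$(mktemp -d)
mode=$(stat -c '%a' "$dir")          # e.g. 700
owner=$(stat -c '%U:%G' "$dir")      # e.g. shubh:shubh
rm -rf "$dir"
mkdir "$dir"
chmod "$mode" "$dir"
chown "$owner" "$dir"                # chown to oneself needs no root
rm -rf "$dir"
```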
Conclusion
We first discussed what causes the "Argument list too long" error, and then looked at several ways to work around it, from find and xargs to shell loops and manual splitting.