
I have a large directory structure with many files named "configuration.txt". For each instance of configuration.txt that has a directory "n10" somewhere in its path (and there are many such instances of this particular directory), I would like to do a search-and-replace where every instance of the string "DNSMax=20" is replaced with the string "DNSMax=50".

Please note that my path names contain spaces.

Could someone please give a Bash shell script that, if invoked from the root of my large directory structure, would accomplish this task?

I am using RedHat Linux.

Thank you!

  • It wasn't my downvote, but I'm tempted. You should know better than to ask a question without showing what you've tried. What have you thought of doing? Which command finds files that meet certain criteria? Which editing tools have you considered? Commented Mar 27, 2014 at 14:14
  • I have tried various combinations of find, grep, sed, xargs, and backquoting of commands to accomplish this task. It was several days ago that I made these attempts and I do not recall all of the combinations I tried or the corresponding results. Regardless of which attempts I found not to work, someone out there in the community knows how to do this task with ease. Commented Mar 27, 2014 at 14:19
  • It sounds like you need a configuration management system. Commented Mar 27, 2014 at 14:52

2 Answers


Using find and GNU sed:

find . -path '*/n10/*' -name configuration.txt \
      -exec sed -i 's/DNSMax=20/DNSMax=50/' {} \;

(Run from the root of your directory structure; find quotes each pathname itself, so spaces in directory names are not a problem here.)

Comments

This is the part where I'm very, very embarrassed. +1.
...that said, this should use a disclaimer that it requires GNU find and sed, since -i is nonstandard for sed, as is -path for find.
But since the system is explicitly RedHat Linux, GNU find and sed are likely to be the resident commands — though I agree the caveat should be stated in case someone tries it on, say, AIX or Solaris and it does not work. The \; should be replaced by + to achieve the xargs effect (one sed command for many files).
@JonathanLeffler, I agree, but then, we're answering not just for the OP of the question, but anyone who ends up on this page with a similar problem as well; not everyone there will be in the same environment.
If the sed doesn't work, then Perl can be used, as in Vijay's answer. However, there's a reasonable chance that if you don't have GNU sed, you don't have GNU find, and then you are left with a less tractable problem of filtering the pathnames. I'd probably go with find / -name configuration.txt -exec custom-script.sh {} + (assuming POSIX 2008 find) where the script contains: for file in "$@"; do case "$file" in */n10/*) perl -pi -e 's/DNSMax=20\b/DNSMax=50/' "$file";; esac; done.

This will work provided there are no spaces in the file paths.

find . -name "configuration.txt"|grep '\/n10\/'|xargs perl -pi -e 's/DNSMax=20/DNSMax=50/'

Comments

Why the backslashes in the grep? What about directory paths with spaces in the name?
Hmm. Use of perl is an improvement on my answer; use of newline-delimited filenames is a step back (since valid UNIX filenames can contain $'\n').
@JonathanLeffler, I'm not sure spaces are a problem. Certainly, though, newlines are.
@CharlesDuffy: spaces are a problem too because xargs splits on white space, not just on newlines. (Historical design — not a good choice, really, but it's 30 years too late to change it.) I have my own program, xargl, which works one file per line. I don't actually use it though.
Ahh. I'd forgotten. Yup, that's a problem -- and a big one, given the potential for malicious filenames such as "here is/ /etc/passwd /n10/configuration.txt". Can't advise use of this solution, given same.
