Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
The other day I answered a question where it was necessary to loop through all the sub-directories in a given directory.
As a self-taught bash user I quickly answered the question. However, in my answer I used for DIR in $(find ...); do "something"; done to loop through the sub-directories. A polite bash guru pointed out that if I continued with bad habits like that, I would one day find myself in a lot of trouble.
While correcting my answer I found that I had (embarrassingly) many more bad habits. I could not find any topic like this on Stack Overflow, but I am sure there are many people like me at a "beginner(++)" level with bad habits based on methods that "sort of work".
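For illustration, here is that fragile pattern next to two safer alternatives; the directory layout is made up, and the find | while read idiom is one common fix, not the only one:

```shell
# Fragile: $(find ...) word-splits on spaces/newlines and glob-expands
# each resulting word, so a directory named "my dir" becomes "my" and "dir".
# for DIR in $(find . -type d); do echo "$DIR"; done

# Safer: NUL-delimit the results and read them back verbatim.
find . -type d -print0 |
while IFS= read -r -d '' dir; do
    printf '%s\n' "$dir"
done

# Simplest, when one level of sub-directories is enough: a quoted glob.
for dir in ./*/; do
    printf '%s\n' "$dir"
done
```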
My question is generic albeit simple:
What are important/damaging pitfalls to avoid for someone who is using bash? (If you find the question too generic to answer, I would appreciate a short description of a case, with a link to documentation or another SO question on the subject.)
E.g. when performing loops to edit files/dirs, using glob characters, pipes, logicals, shorthand notation (i.e. ``) and, last but not least, applying functions for purposes for which they should not be used.
I found Google's Shell Style Guide very instructive. I changed some old habits after reading it, and it has a lot of examples.
EDIT: Some points to watch are below.
Working with variables: set value without $ and access with $
Avoid confusion by using ${..} when using the variable
Call a command within your command line with $()
The old backtick syntax `` still works, but it is easy to make mistakes with it.
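A quick sketch of both points; the variable names are only illustrative:

```shell
host=$(uname -n)             # $(): modern command substitution
greeting="Hello, ${host}."   # ${..}: the variable's boundary is explicit

# Backticks also work, but nesting them needs escaping:
#   inner=`echo \`echo hi\``
# whereas $() nests without any escapes:
inner=$(echo "$(echo hi)")
printf '%s\n' "$inner"       # prints: hi
```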
Avoid arrays
You normally do not really need them and they have a more difficult syntax
Start your script with #!/bin/bash on the first line
The system will then run the script with bash when it is available, and give an error when it is missing.
Learn regex
sed can become your close friend. Even when you only use -e 's/.../.../g'
Other utilities, such as grep and awk, use regex too.
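For instance, a made-up one-liner showing the same POSIX character classes working in both sed and grep:

```shell
# Squeeze every run of whitespace down to a single comma:
echo "one  two   three" | sed -E 's/[[:space:]]+/,/g'   # prints: one,two,three

# The same class syntax works in grep:
printf 'alpha\nbeta42\n' | grep -E '^[[:alpha:]]+$'     # prints: alpha
```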
Use the case statement when checking against different possible values
Avoid too much if-then-else nesting
Do not skip the if keyword when testing with [[ .. ]]
A bare [[ .. ]] && command works, but the full if statement is easier to read.
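A small sketch of the two equivalent forms (the file path is just an example):

```shell
file=/etc/hosts

# Short form: perfectly valid, but easy to misread in a long script.
[[ -r $file ]] && echo "readable"

# Full if statement: the same test, with the intent spelled out.
if [[ -r $file ]]; then
    echo "readable"
fi
```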
Do not use eval
When you finally understand the syntax, you will learn that it is dangerous.
Shell builtins are faster but more difficult to read
basename is a simple utility but slower than ${fullfile##*/}
Maybe you should store a note with some handy builtins somewhere.
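A few parameter expansions worth keeping in such a note; the path is invented for the example:

```shell
fullfile=/path/to/archive.tar.gz

basename "$fullfile"     # external utility:   archive.tar.gz
echo "${fullfile##*/}"   # builtin equivalent: archive.tar.gz
echo "${fullfile%/*}"    # like dirname:       /path/to
echo "${fullfile##*.}"   # last extension:     gz
```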
Downwards compatible?
When you think your scripts might have to be migrated to Solaris/AIX with ksh someday, try avoiding bash-specific syntax and Linux utilities. That would be quite difficult (the -i flag in sed, additional options in find) so getting bash installed on Solaris/AIX might be a better way.
When you have a lot of scripts using the same functions, store the functions in a file like common.h
Source this file with . common.h
Learn the difference between function xxx {..} and xxx() {..}
Use local variables, and return values via stdout or the exit status
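A sketch of both return channels; the function names are invented for the example, and the ${s^^} expansion needs bash 4+:

```shell
to_upper() {                   # xxx() {..} is the portable POSIX-style form
    local s=$1                 # local: does not leak into the caller's scope
    printf '%s\n' "${s^^}"     # hand data back via stdout (bash 4+)
}

upper=$(to_upper "hello")      # capture the function's stdout
echo "$upper"                  # prints: HELLO

is_dir() {
    [[ -d $1 ]]                # the exit status is the other return channel
}
if is_dir /tmp; then
    echo "/tmp is a directory"
fi
```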
Related
This question already has answers here:
Pretty-print for shell script
(4 answers)
Closed 7 years ago.
Talking about good coding practices: my code is getting bigger and bigger, and I want to check that all my "if" statements and "for" loops are properly written.
I think the proper word for that is indentation (thanks @tgo).
So I have this:
if(cond1 = cond2)
if(cond3=cond4)
bla
fi
fi
but I want the following:
if(cond1 = cond2)
    if(cond3=cond4)
        bla
    fi
fi
But, for instance, using Sublime Text I cannot see it like this. So, repeating the question: is there any tool, software or something that can help me with this?
Update: Sublime Text has an option for this (Edit -> Line -> Indent). I couldn't add this to the answer.
I use vim for all my code editing (and I write a lot of bash scripts) and it has smart indenting that defaults to normal, OK stuff for all the languages I use. If you have smart indenting turned on and copy and paste code from your first block into vim (properly set up with filetype=sh), it'll turn out like your second block.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 2 years ago.
I know there are many linters for programming languages, like pep8 for python, but I have never come across one for a makefile. Are there any such linters for makefiles?
Any other ways to programmatically detect errors or problems in makefiles without actually running them?
As my use of make has grown, my makefile keeps getting more complicated and long, and for me it would make sense to have a linter to keep the makefile more readable.
Things have apparently changed. I found the following:
Checkmake
Mint
Of the two, Checkmake has (as of 2018/11) more recent development, but I haven't tried either.
I also do not know where to find a makefile lint (a web search for "make file lint" got me here), but here is an incomplete list of shallow idea fragments for implementing a makefile lint utility...
Whitespace is one aspect of readability, as tabs and spaces have distinct semantics within a makefile. Emacs' makefile-mode by default warns you about "suspicious" lines when you try to save a makefile with badly spaced tabs. Maybe it would be feasible to run Emacs in batch mode and invoke the parsing and verification functions from that mode. If somebody were to start implementing such a makefile lint utility, that mode's Emacs Lisp source could be an interesting place to check.
About checking for correctness, @Mark Galeck in his answer already mentioned --warn-undefined-variables. The problem is that there is a lot of output about undefined but standardized variables. To improve on this idea, a simple wrapper could be added to filter out messages about those variables, so that the real typos would be spotted. In this case make could be run with the option --just-print (aka -n or --dry-run) so as to not run the actual commands that build the targets.
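A minimal sketch of such a wrapper, written as a shell function; the list of ignored variables is only an example, and a real tool would enumerate all the standardized names:

```shell
# Hypothetical wrapper: dry-run make with undefined-variable warnings,
# hiding noise from well-known variables that are often legitimately
# unset, so that real typos stand out.
make_lint() {
    make --warn-undefined-variables --just-print "$@" 2>&1 |
        grep 'warning: undefined variable' |
        grep -Ev "undefined variable [\`'](MAKEFLAGS|MAKECMDGOALS|MFLAGS|CURDIR)'"
}
```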
It is not a good idea to perform any changes when make is run with the --just-print option, so it would be useful to grep for $(shell ...) function calls and try to ensure that nothing is changed from within them. A first iteration of what we could check for: $(shell pwd) and some other common non-destructive uses are okay; anything else should trigger a warning for a manual check.
We could grep for $ not followed by ( (maybe something like [$][^$(][[:space:]] expressed with POSIX regular expressions) to catch cases like $VARIABLE which parses as $(V)ARIABLE and is possibly not what the author intended and also is not good style.
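As a sketch, such a check could be a one-line grep wrapped in a function. The regex here is simplified: make's automatic variables like $@ and $< are single characters, so requiring two name characters after the $ avoids flagging them:

```shell
# Hypothetical heuristic: report lines where $ is followed by two or more
# name characters, i.e. $VARIABLE written without $(...) or ${...}.
check_bare_dollar() {
    grep -nE '\$[A-Za-z_][A-Za-z_0-9]' "$1"
}
```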
The problem with makefiles is that they are so complex, with all the nested constructs like $(shell), $(call), $(eval) and rule evaluations; the results can change based on input from the environment, the command line, or the calling make invocation; also, there are many implicit rules and other definitions that make any deeper semantic analysis problematic. I think an all-encompassing make lint utility is not feasible (except perhaps built into the make utility itself), but certain codified guidelines and heuristic checks would already prove useful indeed.
The only thing resembling lint behaviour is the command-line option --warn-undefined-variables.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I am teaching an upper-division software engineering course and am reviewing every student's code. Some of my students have picked up the habit elsewhere of adding a comment to the right of every closing brace identifying the statement type, such as:
if (x > 3) {
y = 10;
} //if
I have told the students to follow the Android code style guidelines, which says nothing about this practice. On what grounds should I tell them not to do this (besides personally not liking it), or should I permit it?
Comments are for clarifying code and increasing readability. It's clear enough to most reasonable software developers that the statement is an "if." Furthermore, many IDEs and editors automatically highlight brackets such as these, so the comment isn't necessary. Generally, you should save comments for describing what methods, classes and variables do (e.g. in Javadoc), or what subroutines within a method will do. This is based on the general guideline of making sure everything you add improves the code.
Tell them that they should assume the person who reviews the code knows the language syntax and how to program. Comments should be rare, and should indicate and explain weird, non-obvious code sections (for instance, when the API provided by some library is bugged and workarounds/hacks are needed). We've got documentation (and unit tests) to explain how to use code and how it should behave. For educational purposes you can write a small class/module filled with such "comment-documentation", give it to the students, and ask them what they learned about the code from these comments.
Well, most likely this will end up in a discussion based on personal preference, which is not within the scope of Stack Overflow. But I'll answer anyway:
In my opinion, that's a bad thing to do - for multiple reasons.
It messes up the code. The more comments there are, the less readable it gets. A single } on a line tells me, instantly, that the last block ends here. With a comment behind it there is more to read, and no additional info (but people will read it anyway, because they don't know that the comment doesn't contain any info... and because people tend to read everything automatically).
It leads to sloppy indentation. After all, that may even be the reasons people started that in the first place.
It's unnecessary: if I indent the code in a consistent manner, it shouldn't be necessary to note what was closed; it is easily visible by going up to the last statement with the same indentation level. In most cases (unless you're reverse-indenting, or whatever that is called, which I don't like at all) this should be very easy, as there is nothing in between...
It leads to bigger file sizes. That may be irrelevant on modern systems, but still.
Doing it every time is overkill. It depends on the level of indentation and the length of your function, but those are usually signs that you need to step back and refactor. The only time I explicitly do it is for closing namespaces in C++.
This question already has answers here:
What is the difference between $(command) and `command` in shell programming?
(6 answers)
Closed 8 years ago.
What's the preferred way to do command substitution in bash?
I've always done it like this:
echo "Hello, `whoami`."
But recently, I've often seen it written like this:
echo "Hello, $(whoami)."
What's the preferred syntax, and why? Or are they pretty much interchangeable?
I tend to favor the first, simply because my text editor seems to know what it is, and does syntax highlighting appropriately.
I read here that escaped characters act a bit differently in each case, but it's not clear to me which behavior is preferable, or if it just depends on the situation.
Side question: Is it bad practice to use both forms in one script, for example when nesting command substitutions?
There are several questions/issues here, so I'll repeat each section of the poster's text, block-quoted, and followed by my response.
What's the preferred syntax, and why? Or are they pretty much interchangeable?
I would say that the $(some_command) form is preferred over the `some_command` form. The second form, using a pair of backquotes (the "`" character, also called a backtick and a grave accent), is the historical way of doing it. The first form, using dollar sign and parentheses, is a newer POSIX form, which means it's probably a more standard way of doing it. In turn, I'd think that that means it's more likely to work correctly with different shells and with different *nix implementations.
Another reason given for preferring the first (POSIX) form is that it's easier to read, especially when command substitutions are nested. Plus, with the backtick form, the backtick characters have to be backslash-escaped in the nested (inner) command substitutions.
With the POSIX form, you don't need to do that.
As far as whether they're interchangeable, well, I'd say that, in general, they are interchangeable, apart from the exceptions you mentioned for escaped characters. However, I don't know and cannot say whether all modern shells and all modern *nixes support both forms. I doubt that they do, especially older shells/older *nixes. If I were you, I wouldn't depend on interchangeability without first running a couple of quick, simple tests of each form on any shell/*nix implementations that you plan to run your finished scripts on.
I tend to favor the first, simply because my text editor seems to know what it is, and does syntax highlighting appropriately.
It's unfortunate that your editor doesn't seem to support the POSIX form; maybe you should check to see if there's an update to your editor that supports the POSIX way of doing it. Long shot maybe, but who knows? Or, maybe you should even consider trying a different editor.
GGG, what text editor are you using???
I read here that escaped characters act a bit differently in each case, but it's not clear to me which behavior is preferable, or if it just depends on the situation.
I'd say that it depends on what you're trying to accomplish; in other words, whether you're using escaped characters along with command substitution or not.
Side question: Is it bad practice to use both forms in one script, for example when nesting command substitutions?
Well, it might make the script slightly easier to READ (typographically speaking), but harder to UNDERSTAND! Someone reading your script (or YOU, reading it six months later!) would likely wonder why you didn't just stick to one form or the other--unless you put some sort of note about why you did this in the comments. Plus, mixing both forms in one script would make that script less likely to be portable: In order for the script to work properly, the shell that's executing it has to support BOTH forms, not just one form or the other.
For making a shell script understandable, I'd personally prefer sticking to one form or the other throughout any one script, unless there's a good technical reason to do otherwise. Moreover, I'd prefer the POSIX form over the older form; again, unless there's a good technical reason to do otherwise.
For more on the topic of command substitution, and the two different forms for doing it, I suggest you refer to the section on command substitution in the O'Reilly book "Classic Shell Scripting," second edition, by Robbins and Beebe. In that section, the authors state that the POSIX form for command substitution "is recommended for all new development." I have no financial interest in this book; it's just one I have (and love) on shell scripting, though it's more for intermediate or advanced shell scripting, and not really for beginning shell scripting.
-B.
You can read about the differences in the bash manual. In most cases they are interchangeable.
One thing to mention is that you must escape the backquotes to nest commands:
$ echo $(echo hello $(echo word))
hello word
$ echo `echo hello \`echo word\``
hello word
The backticks are compatible with ancient shells, and so scripts that need to be portable (such as GNU autoconf snippets) should prefer them.
The $() form is a little easier on the eyes, esp. after a few levels of escaping.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Greetings to my revered experts!
I'm writing a command-line tool (similar to ls or cat or uname etc.) and I was wondering if there's a best practice for ordering the arguments in the usage statement.
I'm confused because ls's usage statement prints arguments in alphabetical order, whereas cat's usage statement does not!
Try cat --help and ls --help.
Can you refer me to some documentation too, if there is a standard?
Ha, standard! No, certainly nothing like that. Pick your favorite, whichever looks nice and is organized well and is on everyone's computer already, and mimic it.
My opinions follow:
I don't see any value in alphabetical.
Order it logically, arranged into categories and put useful stuff first.
Make sure the parsing engine is solid and "standard", preferably using someone else's. Boost's Program Options or Python's optparse are good if they're in the right language for you. Most other languages have some too.
Make sure to include many examples that span the gamut of use.
No standard, so to speak, but they should probably be grouped by usage patterns, which is how most people would use them (not alphabetically).
As with all documentation and technical writing, you first have to decide on your audience.
For example, when you want to figure out how to get sort to ignore case, you rarely know already that it's -f (fold case, who the hell thought of that?). The most useful output would have a section on data transformation options (e.g., ignore case, treat accented characters as unaccented), another on key selection (e.g., which fields or sub-fields), another of key comparisons (e.g., alpha, numeric, collation) and so on.
In any case, the sort of person who already knows it's the -f option will also know how to use less to search for that option without having to page through reams of unneeded information:-)
In fact, I'd go one better. Have two possible outputs. Make the default a usage-based format but, at the top of that, make the first usage a way of getting an alphabetical listing:
pax> paxprog --help
paxprog - truly an amazing program.
paxprog is capable of doing anything you want.

Help output options:
    --help
        Usage-based assistance (default).
    --alpha-help
        All options in alphabetical order.

Coffee-making options:
    --temp=chilled|tepid|hot|damnhot
        Selects the temperature.

Blah, blah, blah ...
There most certainly is a standard: http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap12.html. Note paragraph 3: "Options are usually listed in alphabetical order unless this would make the utility description more confusing."
There's no standard, although there are some general conventions, which, I assume, you're already familiar with. It's not 100% though.