Should shadowing be avoided with package names and variables? - go

There are a lot of examples in test code that shadow package names from test libraries like testify.
For example:
import (
	"testing"

	"github.com/stretchr/testify/assert"
	"github.com/stretchr/testify/require"
)

func TestSomething(t *testing.T) {
	assert := assert.New(t)
	require := require.New(t)

	var a string = "Hello"
	var b string = "Hello"

	require.NotEmpty(a) // use the require instance so it is not an unused variable
	assert.Equal(a, b, "The two words should be the same.")
}
Shouldn't it be an explicit decision and a rare occurrence to shadow a package name with a local variable?
import (
	"testing"

	"github.com/stretchr/testify/assert"
	"github.com/stretchr/testify/require"
)

func TestSomething(t *testing.T) {
	assertions := assert.New(t)
	req := require.New(t)

	var a string = "Hello"
	var b string = "Hello"

	req.NotEmpty(a)
	assertions.Equal(a, b, "The two words should be the same.")
}
I'd expect the proper behavior to be aliasing the package, like pkgassert, which doesn't feel idiomatic to me, or ensuring that variables are not given the same name as the package: assertions := assert.New(t).
Other examples:
Ok: buf := bytes.Buffer{}
Bad: bytes := bytes.Buffer{}
Note: I'm not focused on variable naming patterns, as Go tends to recommend single-letter or short variable names within the narrow scope you might use the buffer for. Assume the longer name is acceptable and desired due to the function length.
If this is desired behavior in testing packages, I'd like to know why.
It appears to violate key principles of idiomatic Go: "no magic" and being readable.
If there are undesired side effects from this I'd like to know of that as well.
My assumption is that, after the shadowing assignment, any calls to the package later in the same scope wouldn't work, because the package name has been shadowed by the variable.
Edit
Scope Shadowing in Go was a nice read and gave some great points on usage. The main takeaway I had from this and from Stack commenters:
It can enhance readability. (I think this is probably because Go prefers small, concise package names, so package names often look and read like variable names.)
Parallel Test Issues was something I recall reading a while back. I didn't understand the recommendation to "hide" the loop variable by writing:
for _, tc := range tests {
	tc := tc
	t.Run(tc.name, func(t *testing.T) {
		t.Parallel()
		// Here you test tc.value against a test function.
		// Let's use t.Log as our test function :-)
		t.Log(tc.value)
	})
}
With a better understanding of shadowing, I can see that this "hiding" means shadowing the outer loop variable tc by re-declaring it inside the loop body.
This seems like a good use case, though I still feel it reads as "clever" rather than clear if not commented, and sits uneasily with the idiomatic Go goals of being clear and readable with no "magic".

After reviewing all the feedback and the writing I've seen, I'm going to summarize the findings here.
If you are investigating this behavior after seeing linting alerts from a tool like GoLand, hopefully this answer will save you some time and make clear that it's OK and idiomatic, and that issues arise only in cases where you actually need to call the package again after shadowing it.
Shadowing Package Names With Local Variables
This is a matter of opinion; there is no official statement indicating this is considered non-idiomatic code.
Opinions seem to overall lean towards avoiding shadowing unless intentionally done for clarity and readability, but again this is opinion, not a requirement.
Once shadowed, the package is unavailable in that scope, as the package name now refers to the local variable.
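For example, a minimal sketch (not from the original question) illustrating the effect; the test body is hypothetical:
package shadow_test

import (
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestShadowedPackageName(t *testing.T) {
	assert := assert.New(t) // "assert" now refers to the *assert.Assertions value
	assert.Equal("Hello", "Hello")

	// A later package-level call such as assert.ObjectsAreEqual("a", "a")
	// no longer compiles in this scope, because the name resolves to the
	// local variable, which has no such method.
}
After the shadowing line, only the methods of *assert.Assertions are reachable through the name assert; the package itself is inaccessible for the rest of the function.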
I've observed that this seems more likely in Go than in some languages, because Go encourages package names that are short, single-word, and easy to type. It seems almost inevitable that assert := assert.New(t) will result.
Consistency, as always in matters like this, is key. If a team decides never to shadow package names, then assertions := assert.New(t) could be used, or, as mentioned, the package could be aliased, such as import tassert "github.com/stretchr/testify/assert". This is a matter of team convention; either way is legal and considered idiomatic.
In the case of testing, this is commonly done because the package itself isn't needed again after the initial variable is created. Example below using the is package. The package author has commented on this (again, it is opinionated, but nothing breaks if you use is := is.New(t)).
package package_test

import (
	"testing"

	"package"
	iz "github.com/matryer/is"
)

func TestFuncName(t *testing.T) {
	is := iz.New(t)
	got := FuncName()
	want := ""
	is.Equal(got, want) // AssertMessage
}
Shadowing Behavior Can Address Goroutine Variable Issues
Shadowing is a known behavior and documented clearly in Effective Go.
The bug is that in a Go for loop, the loop variable is reused for each iteration, so the req variable is shared across all goroutines. That's not what we want.
It may seem odd to write req := req but it's legal and idiomatic in Go to do this. You get a fresh version of the variable with the same name, deliberately shadowing the loop variable locally but unique to each goroutine.
This can also be solved without shadowing by passing the variable into the closure as a parameter, which some find simpler. Both are legal and considered idiomatic per the guide.
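A minimal sketch of both variants (requests, request, and process are illustrative names, not from the linked article):
// Variant 1: shadow the loop variable so each goroutine captures its own copy.
for _, req := range requests {
	req := req // fresh variable per iteration, deliberately shadowing the loop variable
	go func() {
		process(req)
	}()
}

// Variant 2: avoid shadowing by passing the value into the closure as a parameter.
for _, req := range requests {
	go func(req request) {
		process(req)
	}(req)
}
(Note: since Go 1.22 the for loop variable is scoped per iteration, so on current toolchains neither workaround is strictly necessary; the pattern above reflects the behavior the linked articles describe.)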
Shadowing for Tests
Related to this goroutine behavior: in the context of testing, parallel subtests run in goroutines and therefore benefit from this pattern as well.
The tc := tc statement is required in the parallel test example because the closure runs in a goroutine. More generally, the tc := tc idiom is used in the context of closures where the closure can be executed after the start of the next loop iteration - credit #gopher in comments.
The gist Be Careful With Table Driven Tests mentions this behavior as well, in its How To Solve This section.
Detecting
The shadow check for go vet can detect shadowing, but it is not considered part of the stable vet checks and was removed; see GitHub Issue 29260 and GitHub Issue 34053.
The removal seems to have been prompted by compatibility concerns around the unsafeptr flag. Additional comments point towards also wanting better heuristics before the check is included in go vet as non-experimental.
Go Vet Shadow
Instructions on running it via go vet, from the package documentation:
go install golang.org/x/tools/go/analysis/passes/shadow/cmd/shadow@latest
go vet -vettool=$(which shadow)
For those wanting to lint this, golangci-lint has an option to configure shadowing checks in its configuration (search the docs for check-shadowing, as there is no anchor link to that section).
Further Reading
Scope and Shadowing in Go
Scopes and Closures

Wow, that is awful code. While it might not be a compiler-enforced rule, if I see this:
import "github.com/stretchr/testify/assert"
Then in that file, as far as I'm concerned, any identifier spelled assert is the top level import from that package, period. Anyone who breaks that pattern, in the name of being "clever" or whatever, is just asking for trouble. People's muscle memory is trained to recognize top level imports, so please, do not do stuff like this. If you want to avoid thinking of a new name, just remove some letters:
asrt := assert.New(t)
or add a letter:
nAssert := assert.New(t)
or alias the import:
import tAssert "github.com/stretchr/testify/assert"
Just please, do not follow the poor example given in that code.

Related

Golang - Performance difference between literals and constants

I mostly use constants for documentation purposes, e.g. a useful variable name, or when I repeat certain sequences of strings over and over and don't want to change them manually. But I was wondering whether there's any performance difference. Am I right in my assumption that there's no runtime difference between a literal and a constant, since constants are replaced at compile time?
Maybe I am misunderstanding, but I didn't find anything that tells me that this is wrong. The Go Tour doesn't provide any valuable information on this, and neither did the Constants blog post.
There's nothing that says one way or another whether even this trivial program:
package main
func main() {}
might run fast as lightning when compiled on a Tuesday, but slow as molasses when compiled on a late Friday afternoon. (Perhaps the Go compiler is anxious to head home for a beer and a weekend off and produced terrible code on Friday afternoon.1)
That said, if you're comparing, e.g.:
package main

import (
	"fmt"
)

const hello = "hello"

var playground = "playground"

func main() {
	fmt.Printf("%s, %s\n", hello, playground)
}
we might note that in the const variant (hello), the compiler is forced to know at compile time that the string literal "hello" is a string constant, while in the var variant (playground), the compiler could be lazy and assume that the variable playground might be modified in some other function. This, combined with the ability of the compiler to know that fmt.Printf is a particular function (the way GCC inserts special knowledge of the C printf function, for instance), could allow the compiler to more easily compile this to:
fmt.Printf("hello, %s\n", playground)
where only one reflection-driven %s conversion happens at run time, in case the variable playground has changed. But the existing Go compilers use SSA (see also https://golang.org/pkg/cmd/compile/internal/ssa/), and there are no writes to the variable, so we can expect simple (and usually simple = fast) runtime code here.
Playing with the Godbolt compiler site, it seems that when using const, the current compiler actually has to insert one conversion to string. The var version winds up with less runtime code. I didn't test it with string literals inserted. The %s directives are never expanded in line, but fmt.Printf really calls fmt.Fprintf directly, with os.Stdout as the first argument.
Overall, you're usually best off writing the clearest code you can. Then, if it's too slow (for whatever definition you have of "too slow"), measure. I'm guilty of overdoing my coding-time optimization myself, though. :-)
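If you do decide to measure, a minimal benchmark sketch along these lines would do (the names and the io.Discard sink are illustrative, not from the original question):
package lit_test

import (
	"fmt"
	"io"
	"testing"
)

const helloConst = "hello"

var helloVar = "hello"

func BenchmarkConst(b *testing.B) {
	for i := 0; i < b.N; i++ {
		fmt.Fprintf(io.Discard, "%s, playground\n", helloConst)
	}
}

func BenchmarkVar(b *testing.B) {
	for i := 0; i < b.N; i++ {
		fmt.Fprintf(io.Discard, "%s, playground\n", helloVar)
	}
}
Run it with go test -bench=. and compare the two; any difference is likely to be lost in the noise of the fmt machinery, which is rather the point.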
1. Don't anthropomorphize computers. They hate that!

Why Does Golang Allow Compilation of Unused Functions?

Unused private/unexported functions could be detected. Why doesn't the compiler complain about them like it does for unused variables?
Edit: The question also applies to unused private types/interfaces.
I believe this is a combination of scope and the empty interface, interface{}.
This is the same reason that you can declare a variable at the package level that is unused and the code will build just fine.
This snippet is perfectly valid Go code:
package main

import (
	"fmt"
	"time"
)

var (
	testVar = "sup"
)

func main() {
	start := time.Now()
	fmt.Println("This sure was a test")
	//Mon Jan 2 15:04:05 -0700 MST 2006
	fmt.Println("Finished!\nTimeElapsed:", time.Since(start))
}
Even though the variable testVar is never used.
There are several related questions here, and I think they all have the same general answer.
Why are unused variables not allowed?
Why are unused function parameters allowed if unused variables are not?
Why are unused functions/structs allowed?
...
The general answer is that unused variables in the scope of a function are ALWAYS either a waste of compiler time, or a legitimate error - so they are strictly not allowed.
However, unused function parameters, as well as private structs and functions, may be there to satisfy an interface. At the very least they ALL satisfy the empty interface, interface{}. As such, they are not at all guaranteed to be errors.
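A minimal illustration of that asymmetry (a sketch, not from the answer above):
package main

import "fmt"

// unusedHelper is never called anywhere, yet the package still compiles.
func unusedHelper() string {
	return "never called"
}

func main() {
	fmt.Println("builds fine")

	// Uncommenting the next line breaks the build with a
	// "declared and not used" error:
	// x := 42
}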
There doesn't appear to be any official documentation outlining the reasoning behind this particular corner of the golang philosophy, but as pointed out in the answer to a similar question you might have better luck finding answers and asking questions on the golang-nuts forum.
Hope this helps!

Do short variable declarations lead to poorly structured code in Go? [closed]

From looking at a lot of Go code on GitHub, I have noticed that Go coders love the short variable declaration (:=) and use it very often. Here's an example: Coding Style. But this usage seems far too often to create poorly structured code: very long functions that bundle lots of functionality together, because short variable declarations may appear only inside functions. If you want to set up a package that encapsulates something akin to a class, with members that several short, modularized functions operate on, as good structured programming and OOP practices mandate, you can't really use short variable declarations effectively for member variables. I tend to get uneasy whenever I see any function that's more than 10 or 15 lines long - I know something's probably not right with that design.
Personally, I'm not a big fan of short variable declarations except for local initialization of loop counters, etc. Aside from the above-mentioned issue, I like to see clearly the type I'm working with. Particularly when looking over new code, short variable declarations assume that the reader knows what the function initializing the variable returns, or oblige them to go and find out, or deduce it from context. So that code becomes less readable and requires the reader to stop, and perhaps search somewhere for its meaning, whereas a var declaration might make things immediately clear.
(I suppose one way to write better code and still use short variable declarations would be to avoid the use of package-global members entirely and parameterize all your functions - not necessarily a bad thing - but this probably creates more work than you will save using short variable declarations.)
As a result I have been opting to use this sort of design for my packages, similar to the way declaration and initialization work in traditional OOP languages such as Delphi and C++:
package myPackage

import (
	"importA"
	"importB"
)

var m_member1 *importA.T
var m_member2 *importB.T

func init() {
	m_member1 = new(importA.T)
	m_member2 = new(importB.T)
}
Then I have clearly typed, initialized, and encapsulated package members that are available for use in the package. Yes, this does violate the good practice of initializing only when necessary, but I don't have to do this in init() either - I can do it on an as-needed basis, when the member is used for the first time, although that has other potential complications. (Be that as it may, since initialization of class members in a constructor has been common practice for a long time, I don't have much of a problem with this, regardless.)
Is this non-idiomatic, "bad" code in Go? Is the abundant use of short variable declarations, with their IMO negative consequences, considered a good thing? Frankly I fail to see how it could be. I tend to think that perhaps short variable declarations are being used too much by programmers who just love the short syntax, and the result is a lot of bloated looking spaghetti style code. Am I missing something?
Edit: Since the question as stated caused a good deal of confusion, I'll try to illustrate with a simple example (this may or may not compile - wrote quickly just to illustrate)
If I write:
package main

import (
	"packageA"
	"packageB"
)

func DoStuff() {
	a := packageA.T{}
	a.Dostuff()
}
Then it will be very easy to continue and write:
func doStuff() {
	a := packageA.T{}
	b := packageB.T{}
	Dostuff(a)
	DoMorestuff(b)
	DoEvenMorestuff(a, b)
	DoLotsofstuff(a, b)
	.....
}

func main() {
	doStuff()
}
IMO bundled, bloated, poorly structured code.
______________________
But when I write
package main

import (
	"packageA"
	"packageB"
)

var a *packageA.T
var b *packageB.T

func init() {
	a = new(packageA.T)
	b = new(packageB.T)
}
Then I can write:
func dostuff() {
	a.Write()
}

func doMorestuff() {
	b.Write()
}

func doEvenMorestuff() {
	a.Read(b)
}

func doLotsofstuff() {
	a.ReadWrite(a, b)
	b.WriteRead(b, a)
}

func main() {
	dostuff()
	doMorestuff()
	doEvenMorestuff()
	doLotsofstuff()
}
A modularized, pipeline-style design, which cannot be implemented with the short variable declaration form. The best that can be done using the short form is nested, parameterized functions, which are generally not a very good design choice either.
Some complained that this amounts to globals, but in a well designed, encapsulated package with a minimal public interface, that is no more of an issue than declaring variables local to a function. Packages should be atomic. Member variables have been an accepted component of OOP design "forever" and when used properly, following the rules of OOP and structured programming, they are not globals, but locals to the package or module or class which encapsulates them.
Granted, there is no feature of any language that cannot be used, or abused. My question is simply that short variable declarations seem to be ripe for abuse and force certain design decisions that are less than desirable, unless used very discreetly. I'm asking if there is a way to use the form that will circumvent the issues I have with them and afford me their ease of use without the drawbacks.
Edit 2:
Perhaps this is something of a compromise:
package main

import (
	"packageA"
	"packageB"
)

func dostuff(a packageA.T) {
	a.Write()
}

func doMorestuff(b packageB.T) {
	b.Write()
}

func doEvenMorestuff(a packageA.T, b packageB.T) {
	a.Read(b)
}

func doLotsofstuff(a packageA.T, b packageB.T) {
	a.ReadWrite(a, b)
	b.WriteRead(b, a)
}

func doStackofStuff() {
	a := packageA.T{}
	b := packageB.T{}
	dostuff(a)
	doMorestuff(b)
	doEvenMorestuff(a, b)
	doLotsofstuff(a, b)
}

func main() {
	doStackofStuff()
}
Still bundled up in main() but that's not really a complaint - doStackofStuff() is my interface call. In "real code" I would write a separate package for all of it, and only DoStackofStuff() would be public and callable from main() - the rest would be encapsulated. The implementation is broken up in doStackofStuff(), yet using the short form without nesting.
The answer is actually very simple. The only alternative to a short variable declaration, e.g. a := 2, is the long variable declaration, e.g. var a int = 2.
Does either of them promote spaghetti code or make functions inherently longer? No.
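For reference, the two forms declare exactly the same variable; only the syntax differs (a trivial sketch):
func example() {
	a := 2        // short declaration, type inferred as int
	var b int = 2 // long declaration with an explicit type
	var c = 2     // long declaration with the type inferred
	_ = a + b + c // all three are plain int variables
}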
I think you are mixing a few issues here that are not connected:
If you need to emulate classes and structs in Go - don't use packages for them. Use structs. Build "constructors" for them. That's it. I'd hardly even call it emulation, even though it's not 100% identical to C++ or Java classes. I mean, why not just do something like:
type Foo struct {
	Bar string
	Baz int
}

func NewFoo(bar string, baz int) *Foo {
	return &Foo{
		bar,
		baz,
	}
}

// and if you want static state - just do this
var DefaultFoo *Foo

func init() {
	DefaultFoo = NewFoo("foo", 1)
}
I don't fully see why using := inside functions will create spaghetti code. Can you make your point clearer on that? The most harm it can do is scope collisions if you're not careful, like in this example:
package main

import "fmt"

var x int = 3

func main() {
	if x == 3 {
		x := 5
		fmt.Println(x) // will print 5
	}
	fmt.Println(x) // will print 3
}
Going back to your example: it's not bad design to import types from a different package (e.g. have a static HTTP client initialized in a package's init() function). But you have to make sure you are really not mixing up responsibilities between the two packages.
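A sketch of that static-client idea (the package name, timeout, and wrapper function are illustrative, not prescribed by the answer):
package httpapi

import (
	"net/http"
	"time"
)

// A package-level client, initialized once in init().
var client *http.Client

func init() {
	client = &http.Client{Timeout: 10 * time.Second}
}

// Get wraps the shared client so callers never touch it directly,
// keeping the responsibility for HTTP plumbing inside this package.
func Get(url string) (*http.Response, error) {
	return client.Get(url)
}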

How to get all defined types?

package demo

type People struct {
	Name string
	Age  uint
}

type UserInfo struct {
	Address  string
	Hobby    []string
	NickNage string
}
Another package:
import "demo"
In this package, how can I get all the types exported from the demo package?
Using go/importer:
package main

import (
	"fmt"
	"go/importer"
)

func main() {
	pkg, err := importer.Default().Import("time")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	for _, declName := range pkg.Scope().Names() {
		fmt.Println(declName)
	}
}
(note, this returns an error on the Go Playground).
Go retains no master list of structs, interfaces, or variables at the package level, so what you ask is unfortunately impossible.
Drat, I was hoping that Jsor's answer was wrong, but I can't find any way to do it.
All is not lost though: if you have the source to demo, you could use the parser package to fish out the information you need. A bit of a hack, though.
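For instance, a sketch of that parser-based approach, assuming the demo sources live in a local ./demo directory (the path and the filtering are assumptions):
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
)

func main() {
	fset := token.NewFileSet()
	pkgs, err := parser.ParseDir(fset, "./demo", nil, 0)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	for _, pkg := range pkgs {
		for _, file := range pkg.Files {
			for _, decl := range file.Decls {
				gd, ok := decl.(*ast.GenDecl)
				if !ok || gd.Tok != token.TYPE {
					continue // only interested in type declarations
				}
				for _, spec := range gd.Specs {
					if ts, ok := spec.(*ast.TypeSpec); ok && ts.Name.IsExported() {
						fmt.Println(ts.Name.Name) // e.g. People, UserInfo
					}
				}
			}
		}
	}
}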
Do you come from some scripting language? It looks so.
Go has good reasons not to let 'magic' slip into your code.
What looks easy at the beginning (having access to all structs, registering them automatically somewhere, "saving" coding) will end up in a debugging and maintenance nightmare when your project gets larger.
Then you will have to document and look up all your lousy conventions and implications.
I know what I am talking about, because I went this route several times with ruby and nodejs.
Instead, if you make everything explicit, you get useful features, like renaming the People struct and letting
the compiler tell you everywhere it is used in your whole code base (and the Go compiler is faster than you are).
Such possibilities are invaluable for debugging, testing and refactoring.
Also it makes your code easy to reason about for your fellow coders and yourself several months after you have written it.

Built-In source code location

Where in Go's source code can I find the implementation of make?
Turns out the "code search" functionality is almost useless for such a central feature of the language, and I have no good way to determine whether I should be searching for a C function, a Go function, or what.
Also, in the future, how do I figure out this sort of thing without resorting to asking here? (i.e. teach me to fish)
EDIT
P.S. I already found http://golang.org/pkg/builtin/#make, but that, unlike the rest of the go packages, doesn't include a link to the source, presumably because it's somewhere deep in compiler-land.
There is no make() as such. Simply put, this is happening:
go code: make(chan int)
symbol substitution: OMAKE
symbol typechecking: OMAKECHAN
code generation: runtime·makechan
gc, the Go compiler (historically written in a Go-flavoured C), parses the make call according to context (for easier type checking).
This conversion is done in cmd/compile/internal/gc/typecheck.go.
After that, depending on what symbol there is (e.g., OMAKECHAN for make(chan ...)),
the appropriate runtime call is substituted in cmd/compile/internal/gc/walk.go. In case of OMAKECHAN this would be makechan64 or makechan.
Finally, when running the code, said substituted function in pkg/runtime is called.
How do you find this
I tend to find such things mostly by imagining in which stage of the process this
particular thing may happen. In the case of make, knowing that there's no
definition of make in pkg/runtime (the most basic package), it has to happen at the compiler level
and is likely to be substituted with something else.
You then have to search the various compiler stages (gc, *g, *l) and in time you'll find
the definitions.
As a matter of fact, make is a combination of different functions, implemented in Go, in the runtime.
makeslice for e.g. make([]int, 10)
makemap for e.g. make(map[string]int)
makechan for e.g. make(chan int)
The same applies for the other built-ins like append and copy.
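One way to see the substitution for yourself is to compile a tiny program and dump the generated code; the exact runtime function names can vary between Go versions, so treat the comments below as approximate:
package main

// Package-level sinks so escape analysis cannot optimize the allocations away.
var (
	ch chan int
	sl []byte
	mp map[string]int
)

func main() {
	ch = make(chan int, 1)         // lowered to a runtime.makechan call
	sl = make([]byte, 1<<16)       // lowered to a runtime.makeslice call
	mp = make(map[string]int, 100) // lowered to a runtime map-construction call
}
Building with go build -gcflags=-S (or go tool compile -S main.go) prints the generated assembly, where the calls into the runtime appear.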
