Recommended file system setup for development projects? - macos

I just got a new computer (Mac, if relevant) and I'm in the process of downloading IDEs and other stuff intended for development. Is there a recommended pattern for setting up the file system for development?
On a different computer in the past, I just created a folder titled Development in the home directory and then all workspaces were dumped in there. There is a workspace folder for Eclipse projects and then some other folders for Xcode projects.
I searched around and read this blog post, which recommends the conventions used for Go. Any other recommended setups?
I plan to contribute to open source projects and to have some Xcode and Java projects of my own, if any of that's relevant.

This might be primarily opinion based, but the folder structure I've settled on is based on my use cases. I generally use code in two ways: experimenting with a language, and working on a project (and that project sometimes involves multiple languages). Accordingly, I create two folder hierarchies: ~/Code/<language>/ for experimenting with a language, and ~/Git/<projectname>/ for projects.
The Code might look something like this:
Code/
├── Bash
│   └── tmp.sh
├── C
│   └── tmp.c
├── CPP
│   └── tmp.cpp
└── Python
    ├── multifile
    │   ├── first.py
    │   └── second.py
    └── tmp.py
And the Git folder would look something like this:
Git/
├── CoolProject
└── Project1
    ├── README.md
    ├── doc
    └── src
In the Code directory, I worry much less about structure or documentation. Once/if a project grows big or important enough to put under version control, I move it into the Git directory, where I try to follow the conventional folder hierarchy for the language, like your Go link, this Python guide, or whatever Eclipse would generate for Java. I also try to keep a README.md at the root of each project so I'll know what it does and can put it on GitHub easily.
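If it helps, a skeleton like that can be created in one shot from the shell (the language folders here are just examples, not a requirement):
mkdir -p ~/Code/{Bash,C,CPP,Python} ~/Git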

Related

Is there a way to run a Go module from another directory

I have the following project structure, outside of GOPATH.
. // Project root
├── Dockerfile
├── .env
├── README.md
└── src
    ├── main.go
    ├── go.mod
    ├── go.sum
    ├── internal
    │   ├── somepackage
    │   │   ├── main.go
    │   │   └── types.go
    │   ├── someother
    │   │   ├── main.go
    │   │   ├── oauth.go
    │   │   └── types.go
    │   └── models
    │       └── main.go
    └── pkg
        ├── somepackage
        │   └── main.go
        └── anotherpackage
            └── main.go
I want to run my Go module code located in the src directory.
When I cd into the src directory and run go run . or go build ., my code works perfectly.
When I stand at the root of my project, I am unable to run go run ./src or go build ./src. I get the following error.
src/service.go:8:2: cannot find package "web-service/internal/auth" in any of:
        /usr/lib/go/src/web-service/internal/auth (from $GOROOT)
        /home/miloertas/Packages/go/src/web-service/internal/auth (from $GOPATH)
src/endpoints.go:3:8: cannot find package "web-service/internal/handlers" in any of:
        /usr/lib/go/src/web-service/internal/handlers (from $GOROOT)
        /home/miloertas/Packages/go/src/web-service/internal/handlers (from $GOPATH)
It's important that my source code remains in this src directory.
It is equally important that I am able to run and build my code from the root of my project (For example the .env file is located at the root of the repository).
I am therefore looking for a way to run or build my code in the src directory from the root of my project.
I tried moving the go.mod to the root of the project and running go run ./src, but this causes issues of its own:
The go command is now unable to locate the sub-packages in internal and pkg.
VSCode is now lost and running tests is impossible for some reason (mainly because the sub-packages are not found).
Since Go 1.18, it's now possible to achieve this with Go workspaces.
Using the following directory structure
parent-dir/
├─ go.work
└─ hello-world/
   ├─ go.mod
   └─ main.go
You can run the hello-world module from the parent-dir using go run hello-world.
go.work
go 1.18
use ./hello-world
go.mod
module hello-world
go 1.18
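If a go.work file doesn't exist yet, it can typically be generated with the Go tool itself (assuming Go 1.18 or newer), for example:
cd parent-dir
go work init
go work use ./hello-world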
Note: while this is possible, it is not recommended, as pointed out by Volker.
It's important that my source code remains in this src directory. It is equally important that I am able to run and build my code from the root of my project (For example the .env file is located at the root of the repository).
These two requirements are contradictory. You have to let go of one.
Especially the second one is unfounded: do not use go run, use go build. Make the path used to look for the .env file a command line option to your program (Go is not PHP or JavaScript; there simply is no project or source root for the executing binary). Or build the executable somewhere else but execute it in your project root.
Note that having a src folder is -- to put it mildly -- uncommon.
I tried moving the go.mod to the root of the project and running go run ./src, but this causes issues of its own:
Well, start by not using go run at all, use go build. And then try building the actual main package. All the go tooling works best on packages, not on file system folders. If your module is named playing.hardball/for-unspecific-reasons and package main is in src try go build playing.hardball/for-unspecific-reasons/src.
Takeaways even if this doesn't work out the way you want:
Do not use go run. The reasons are manifold; it is useful for running single-file scripts and a loaded footgun for basically every other use case.
The go tool works on import paths. In simple cases the import path can be inferred from the filesystem.
A compiled executable has no notion of a "project directory", "source", "classpath" or whatever, it is a standalone executable runnable everywhere and completely detached from its sources.
Make every filesystem lookup path a configuration option (command-line flag or environment variable); provide practical defaults (e.g. ./); use that when running your executable to tell it where to find static stuff like .env files, templates, icons, CSS files, etc. (see the sketch below).
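A minimal sketch of that approach, assuming Go 1.18+ and only the standard library; the -env flag name and the hand-rolled KEY=VALUE parsing are illustrative, not from the original project:

package main

import (
    "bufio"
    "flag"
    "log"
    "os"
    "strings"
)

func main() {
    // Default to ./.env, but let the caller point the binary anywhere,
    // e.g. ./myservice -env /srv/myapp/.env
    envPath := flag.String("env", ".env", "path to the .env file")
    flag.Parse()

    f, err := os.Open(*envPath)
    if err != nil {
        log.Fatalf("cannot open env file: %v", err)
    }
    defer f.Close()

    // Tiny KEY=VALUE parser; real projects often use a library instead.
    sc := bufio.NewScanner(f)
    for sc.Scan() {
        line := strings.TrimSpace(sc.Text())
        if line == "" || strings.HasPrefix(line, "#") {
            continue
        }
        if k, v, ok := strings.Cut(line, "="); ok {
            os.Setenv(k, v)
        }
    }
    if err := sc.Err(); err != nil {
        log.Fatalf("reading env file: %v", err)
    }
}

Built once with go build, the binary can then be started from any working directory, with -env pointing at the repository's .env file.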

Error building Go project with /cmd structure (multiple entry points)

Here is the directory structure of my project (~/go/src/bitbucket.org/a/b):
├── cmd
│   ├── c
│   │   └── main.go
│   └── d
│       └── main.go
├── config
│   ├── config.go
│   ├── default.json
│   └── development.json
├── go.mod
├── go.sum
├── log
│   └── log.go
├── main.go
I need to compile 2 binaries (one for each module in cmd/).
I have tried running GO111MODULE=on go build ./cmd/c from the project root (~/go/src/bitbucket.org/a/b). It silently finishes without doing anything.
I also tried running GO111MODULE=on go build -o test ./cmd/c. It created a 29 kB test file. When I add execute permission and run it, it fails with this error:
./test: 2: ./test: Syntax error: newline unexpected
I have tried using go 1.12.5 and go 1.11.10.
Also, when I put the main.go file from either of the cmd directories in the project root directory and build, the compiler builds it just fine (binary file size is ~33 MB).
Is it possible to use 2 compiler entry points in a single project?
If you are just looking to get the executables, you can use go install ./... and it will create them in the $GOPATH/bin directory.
And regarding multiple compiler entry points: yes, you can. You can build with go build ./cmd/c ./cmd/d ., but you cannot get the executables that way; per the Go documentation, -o can only be used when building a single package. Instead you can write a makefile that produces all the executables with a single make target.
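For example, such a target could simply invoke go build once per entry point (the bin/ output directory here is just a convention, not required):
go build -o bin/c ./cmd/c
go build -o bin/d ./cmd/d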
And regarding the error that you are seeing, I would need more information. When I tried to build a sample application, everything worked fine. I am not using the GO111MODULE flag, though.
go build gives me a stackoverflow executable
go build -o test ./cmd/c gives me a test executable
go build ./cmd/c gives me a c executable
For your convenience, I uploaded the project to a GitHub repo.

Perl compiled by PAR::Packer doesn't run on other machines

I have an existing 2000 LOC perl script with a Tkx GUI that I just inherited on my first day as an intern at a place where I am the sole programmer (everyone else is an IC engineer, but they do their simulations in perl).
The goal is to produce an executable for clients to run without having to install perl nor anything else. Apparently this has been possible in the past.
I've only been able to get the program to build by installing ActiveTcl 8.5.15 and ActivePerl 5.16, installing Win32::API, Win32::Exe, Tk, Tkx, and Carp via PPM, then installing PAR::Packer via cpanp i. This very specific mix is the only one I found that produced any results that worked.
Then it's:
pp -vvv -l C:\Perl\lib\auto\Tcl\tkkit.dll -l C:\Perl\lib\auto\Tcl\Tcl.dll -l C:\Perl\lib\auto\Win32\API\API.dll --gui -o .\<THE NAME OF THE FILE.EXE> .\<PERL SOURCE>.pl
From here I get a working executable, except it will not run on any of the other machines I've tested it on. It's not an arch issue as far as I can tell.
The documentation by the previous developer is extremely lacking on the subject (he's better on documenting the actual code). I'm told he migrated it to StrawberryPerl then back to ActivePerl when that broke, but as yet it's still pretty broken over here and the existing build environment is long gone.
Any help would be appreciated.
Action: Use PAR::Packer pp to compile Windows binary from a perl script so that clients can run it without perl
Expected Result: A tk GUI Window opens and stays open.
Actual Result: A tk GUI Window does not open, no errors are produced on console nor in any prompt.
Thanks to all who responded.
I actually found the solution myself, which was that this block
BEGIN {
    if (exists $ENV{PAR_PROGNAME}) {
        use Config ();
        # catfile() comes from File::Spec::Functions
        $ENV{PERL_TCL_DL_PATH} = catfile($ENV{PAR_TEMP}, 'tkkit.' . $Config::Config{dlext});
    }
}
needs to appear before "use Tkx;" to properly instruct PAR::Packer to grab the necessary parts of Tcl when packaging the program. Without it, the necessary libraries were being left out and the executable was searching %PATH% for ActiveTcl installations that did not exist. But thank you for the detail about the additional module for debugging.
This is my solution on Windows 7 with Strawberry Perl 5.26.1 and Tkx 1.09:
pp --compile --noscan --dependent --compress 6 hello.pl
Copy the generated .exe and the needed DLLs to a directory, and rename that directory to bin
Copy the tcl and tk libraries to a lib directory at the same level as bin
Now my generated file structure:
$ tree -L 2
.
├── bin
│   ├── hello.exe
│   ├── hello.pl
│   ├── libgcc_s_dw2-1.dll
│   ├── libstdc++-6.dll
│   ├── libwinpthread-1.dll
│   ├── perl526.dll
│   ├── tcl86.dll
│   ├── tk86.dll
│   └── zlib1.dll
└── lib
    ├── tcl8.6
    └── tk8.6
BTW, the whole folder can be compressed to about 2.7MB by 7-zip on my machine.

cfx xpi command deleting compressed files in addon /lib directory?

I have a weird problem when trying to package a Firefox add-on built with version 1.9 of the SDK. The extension's directory structure is something like this:
├── data
│   ├── file1.js
│   ├── file2.js
│   ├── jquery.min.js
│   └── uri.js
├── lib
│   ├── file3.js
│   ├── main.js
│   ├── services
│   │   ├── file4.js
│   │   ├── file5.js
│   │   └── file6.js
│   └── uri.js
├── package.json
└── package.json.backup
As part of the build process, I run the data and lib directories through uglify.js. This appears to work fine: basically I copy the codebase to a different location, run it through uglify, and get the same directory structure except the JS files are compressed.
Next, I run cfx xpi --pkgdir=path/to/ugly/codebase to package the code into an xpi.
If I then move the produced .xpi to a new directory, unzip it with unzip and inspect the contents, most of my lib directory has been deleted. Files in the data directory are fine.
tree resources/addon_name
resources/addon_name
├── data
│   ├── file1.js
│   ├── file2.js
│   ├── jquery.min.js
│   └── uri.js
└── lib
    └── main.js
If I don't uglify the JS files then everything seems to work fine and when I unzip the xpi I will have a full lib directory as I would expect.
Note that this is not a problem with the uglifying process (that was the first thing I checked). When I copy the codebase and uglify it, I can stop the process at that point and list the lib directory. It will contain all the uglified JS files I would expect. It's only after packaging and subsequent unzipping that they are gone.
I have tried reproducing this issue with a brand new extension, but I get a slightly different problem: files in the lib directory are deleted on packaging regardless of whether they are compressed or not. My steps are:
mkdir test_extension
cd test_extension && cfx init
touch lib/uri.js // this is
cd .. && cfx xpi --pkgdir=test_extension // Have to run this part twice to get ID
mkdir unpack && mv test_extension.xpi unpack
cd unpack
unzip test_extension.xpi
ls resources/test_extension/lib
=> main.js // the uri.js file is missing
If lib/uri.js is not required from any JS file of your add-on, it will be removed from the final XPI. So if you have require('./uri.js') in your main.js, the file should be there after packaging.
My guess is that uglifying the libraries makes it impossible for the current cfx tool to generate the proper manifest with all dependencies. See Manifest Generation.
Note: this was originally posted on the mozilla-labs-jetpack mailing list; I copied the answer here so it's useful to anyone who doesn't know about the mailing list.

vim/macvim plugins osx 10.7.4

Could you help me get plugins working for either macvim or the terminal vim?
What I have tried thus far:
I git cloned https://github.com/scrooloose/nerdtree.git to the desktop. I have tried placing the files in ~/.vim, where I created a folder named bundle and put the nerdtree folder inside it.
Alternatively, I have gone to /usr/share/vim/vim73 and placed all the files in their respective folders.
Using this method does work for NERDTree in the terminal vim, however it does not work for other plugins.
Still no luck getting this to work. Help is appreciated.
NERD_tree install details:
Unzip the archive into your ~/.vim directory.
That should put NERD_tree.vim in ~/.vim/plugin and NERD_tree.txt in ~/.vim/doc.
After installing, the ~/.vim looks like this:
.vim/
├── doc
│   ├── NERD_tree.txt
│   └── tags
├── nerdtree_plugin
│   ├── exec_menuitem.vim
│   └── fs_menu.vim
├── plugin
│   └── NERD_tree.vim
└── syntax
    └── nerdtree.vim
I use pathogen. This allows you to put all your plugins in a separate directory, making them very easy to manage, e.g. nerdtree, which you can clone into a folder and update with git pull when there is an update.
Add this to your .vimrc
call pathogen#infect()
then, add nerdtree etc to the folder
~/.vim/bundle
that's all you need to do!
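For example, with pathogen in place, installing NERDTree would typically be a single clone into the bundle directory:
git clone https://github.com/scrooloose/nerdtree.git ~/.vim/bundle/nerdtree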
I believe this needs to be in:
~/.vim/plugin/...
From:
http://vim.runpaint.org/extending/extending-vim-with-scripts-plugins/
I think that the best solution is to do this
git clone https://github.com/scrooloose/nerdtree.git ~/.vim/
That way you can be sure the plugin files end up in the right folders; otherwise git creates a nerdtree folder for you and you have to move its contents into ~/.vim/ yourself.
