I've got a tree structure like this:
upperfolder
│
└───1
│ │ output_v1.xml
│ │ output_v2.xml
│
└───2
│ │ output_v1.xml
│
└───3
and I would like to get to this:
upperfolder
│
└───1
│ │ output_v1.xml
│ │ output_v2.xml
│ │ output_v3.xml
│
└───2
│ │ output_v1.xml
│ │ output_v2.xml
│
└───3
│ │ output_v1.xml
I use a makefile to do this, and it looks like this:
./upperfolder/%:
--- here I would like to grab the last version in each folder %, increment it, and ---
--- compute the next version's file name to pass as a parameter to my python script ---
python3 writer.py <previous_version_path_script> $@_v?.xml
all: ./upperfolder/1/output ./upperfolder/2/output ./upperfolder/3/output ./upperfolder/4/output
Indeed, my python script needs 2 arguments:
1) The path of previous version xml file
2) The path of new version xml file
I tried to eval variables that fetch the last component of the target's path (that is, the folder number, since we can't grab the % value inside the recipe (grrrrrrrrrr)), like this:
$(eval number := $(shell basename $@))
But I'm not really satisfied, because if the folder name is not a simple number but something+number, I would need to extract just the trailing number, so I'm pretty sure there is a better way to do it. As for finding the last version in each folder, to be honest I have no idea how I could do it.
Help or hints would be really appreciated.
You shouldn't use make to do this.
Makefiles require that you know what the final thing you want to build is. That final file is the target make will try to build: it starts from that target and works backward to find the things needed to build it. Make can figure out what kinds of things you might need in order to build a given target, but you have to give it a target: it can't work forward from an existing file to some new file it doesn't know about.
In your case you can't write down in your makefile what the final targets are, because they aren't static: the name of the final target depends on the files that already exist.
Also, make's entire reason for existing is to avoid updating files that already exist and are up to date. In your case you always want to create a new file, so make's ability to skip up-to-date targets buys you nothing.
You would have to write a script that figures out what the current version is, then generates the new version from that. At that point there is little reason to involve make at all: just have your script run the python command to generate the file.
In short, make is simply the wrong tool for the job you want to do.
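If you do go the script route, here is a minimal shell sketch of the idea, assuming the writer.py interface and the output_vN.xml naming scheme described in the question (adjust to taste):

#!/bin/sh
# Sketch: for each subfolder, find the highest output_vN.xml and ask
# writer.py to produce the next version.
for dir in upperfolder/*/; do
    last=$(ls "$dir" | grep -o 'output_v[0-9]*\.xml' | sort -V | tail -n 1)
    if [ -z "$last" ]; then
        prev=""     # no previous version yet; assumes writer.py tolerates this
        next=1
    else
        n=${last#output_v}
        n=${n%.xml}
        prev="$dir$last"
        next=$((n + 1))
    fi
    python3 writer.py "$prev" "${dir}output_v${next}.xml"
done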
I am using swagger to develop a new API written in Go; this is my first swagger project. I installed swagger-codegen and used it to generate my project from a swagger.yaml. I aim to make regenerating from the swagger.yaml file part of my pipeline tasks: a task that executes something like swagger-codegen generate -i ./api/swagger.yaml -l go-server, with ignores strategically set up in my .swagger-codegen-ignore file. There is one thing I don't like, but I can't figure out how to change it. Any advice? Do I need to live with it?
The generated directory structure for go-server looks like this:
.
├── api
│ └── swagger.yaml
├── go # everything in this directory is part of the "swagger" package
│ ├── a_handler_function_file.go
│ ├── logger.go
│ ├── model_struct_file.go
│ ├── routers.go
│ └── ...
├── Dockerfile
└── main.go
I am not keen on the directory called go or the package it produces called swagger. I want something more meaningful to the project.
Does it go against conventions to rename the directory?
Is there a way to configure swagger-codegen to rename these to what I want? I have been researching whether there is a way, but I can't find one.
It seems that search engines have not crawled in a way that effectively lands you on this page in the swagger-codegen git repo: https://github.com/swagger-api/swagger-codegen#customizing-the-generator. Maybe this Q&A will help.
One can either add a -D<configParameterName> to the generate command, or create a config.json file and pass it to the generate command using -c config.json.
For go-server there are only two parameters available: packageName and hideGenerationTimestamp.
So I tried swagger-codegen generate -i ./swagger.yaml -l go-server -DpackageName="myPackageName" and it worked!!!
I also tried creating a config.json file that looks like this
{
"packageName": "myPackageName"
}
and then generate command that looks like this swagger-codegen generate -i ./swagger.yaml -l go-server -c config.json
and that works too.
As for changing the go directory, it looks like I will have to live with it.
I'm a NOOB to ODL.
I downloaded ODL Magnesium and am able to run karaf.
I downloaded DLUX from the mirror (https://github.com/opendaylight/dlux) and it built successfully.
I'd like to install odl-dlux-core, but the dlux/distribution-dlux folder does not exist?
Thoughts?
I manually copied the associated SNAPSHOT files to the system/org folder and was able to install dlux.
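For reference, the copy was roughly along these lines (a sketch; the exact paths depend on your Maven local repository and karaf distribution layout):

cp -r ~/.m2/repository/org/opendaylight/dlux \
      ./system/org/opendaylight/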
opendaylight-user@root>feature:list | grep dlux
features-dlux │ 1.0.0.SNAPSHOT │ x │ Started │ features-dlux │ features-dlux
odl-dlux-core │ 1.0.0.SNAPSHOT │ x │ Started │ odl-dlux-1.0.0-SNAPSHOT │ Opendaylight dlux minimal feature
Now, when I navigate to http://localhost:8181/index.html, I get a dark gray blank page. After inspection, I'm getting a 404 for
http://localhost:8181/src/common/authentification/auth.module.js
I am in fact missing a bunch of files:
topbar.module.js
common.general.module.js
core.module.js
navigation.module.js
login.module.js
layout.module.js
I'll investigate further as to where these archives should be available.
Any advice is appreciated.
thanks
After restarting ODL, I am now able to load the mentioned resources. However, my localhost:8181/index.html serves up a blank page and redirects to localhost:8181/index.html#/topology. I'm wondering if I need to install odl-dlux-topology?
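If a separate feature is needed, installing it from the karaf shell would look like this (assuming odl-dlux-topology is an actual feature name; check feature:list first):

opendaylight-user@root>feature:install odl-dlux-topology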
I have multiple Go projects (and all of them are also Go modules) all in a folder. They are all HTTP servers and exchanging REST calls, thus, I need all of them up and running simultaneously.
So, for local testing purposes, I thought it would be reasonable to run all of them from the parent directory instead of cd-ing into each project root and running go run main.go in multiple terminals.
container_dir/
├── prj1/
│ ├── go.mod
│ ├── main.go
│ └── ...
├── prj2/
│ ├── go.mod
│ ├── main.go
│ └── ...
└── ...
Here are some commands I have tried, and the error messages for each:
container_dir $ go run ./*/*.go
##ERROR: named files must all be in one directory; have ./prj1/ and ./prj2/
container_dir $ go run *.go
##ERROR: stat *.go: no such file or directory
container_dir $ go run ./prj1 ./prj2/
##ERROR: cannot find package "github.com/jackc/pgx/v4" in any of:
/usr/local/go/src/github.com/jackc/pgx/v4 (from $GOROOT)
/home/user/go/src/github.com/jackc/pgx/v4 (from $GOPATH)
cannot find package ...
So, I can give a final rephrase of the question: how do you run multiple Go modules in sibling directories when they have third-party dependencies?
P.S.: As Go modules allow, container_dir is in an arbitrary location, and I expect $GOPATH to be irrelevant.
Go version: 1.13.6
Don't use go run outside of tiny, playground-style tests.
Source paths are a compile-time concern and are irrelevant at runtime; being in "sibling directories" doesn't mean anything once the programs are running.
go run runs a single program, just like building a program and running its executable binary (which is what go run does) runs a single program.
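If the goal is simply to have all the servers up at once for local testing, a small shell sketch like this (assuming the container_dir layout above) builds each module and runs the binaries concurrently:

# run from container_dir
mkdir -p bin
for d in prj1 prj2; do
    (cd "$d" && go build -o "../bin/$d" .)
done
./bin/prj1 & ./bin/prj2 &
wait    # block until both servers exit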
It looks like your go.mod setup is the problem.
Remember you can use a replace directive inside it to reference your other module:
module container_dir/prj2

go 1.13

require (
    container_dir/prj1 v0.0.0
)

replace (
    container_dir/prj1 => ../prj1
)
The require path is what you import in your code, but it gets switched to the relative path at build time.
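(This assumes prj1's own go.mod declares module container_dir/prj1.) With the replace directive in place, a normal build of prj2 resolves prj1 from the sibling directory:

cd prj2 && go build ./...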
I have a repository structure as follows:
xyz/src
1. abc
- p
- q
- r
2. def
- t
- u
- v
3. etc
- o
- m
- n
I have created a go.mod file in src and run go build ./...
Except for local packages, everything is fine. If abc/p is used in def, it throws the following error: cannot find module providing package abc/p. The idea behind keeping the go.mod file in src was that paths would be resolved from where the mod file is located. Can anyone suggest where the go.mod file should ideally be? I also tried placing it one directory above, in xyz, but got the same issue, and I tried creating one for each subdirectory. I am a bit confused by this. Will I have to create a separate repository for abc and the others? Considering that this layout used to work with GOPATH, I think modules should be able to handle it too. Any suggestions?
The most common and easiest approach is a single go.mod file in your repository, where that single go.mod file is placed in the root of your repository.
Russ Cox commented in #26664:
For all but power users, you probably want to adopt the usual convention that one repo = one module. It's important for long-term evolution of code storage options that a repo can contain multiple modules, but it's almost certainly not something you want to do by default.
The Modules wiki says:
For example, if you are creating a module for a repository
github.com/my/repo that will contain two packages with import paths
github.com/my/repo/foo and github.com/my/repo/bar, then the first
line in your go.mod file typically would declare your module path as
module github.com/my/repo, and the corresponding on-disk structure
could be:
repo/
├── go.mod <<<<< Note go.mod is located in repo root
├── bar
│ └── bar.go
└── foo
└── foo.go
In Go source code, packages are imported using the full path including
the module path. For example, if a module declared its identity in its
go.mod as module github.com/my/repo, a consumer could do:
import "example.com/my/repo/bar"
That imports package bar from the module github.com/my/repo.
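Applied to the layout in the question, that would mean a single go.mod in the repository root, created and checked roughly like this (the module path example.com/xyz is illustrative):

cd xyz
go mod init example.com/xyz
go build ./...

and def would then import abc/p by its full path, e.g. import "example.com/xyz/src/abc/p".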
I have a single go.mod in the root of my go application. I am using the following structure inspired by Kat Zien - How Do You Structure Your Go Apps
At the moment one of my applications looks like this:
.
├── bin
├── cmd
│ ├── cli
│ └── server
│ └── main.go
├── pkg
│ ├── http
│ │ └── rest
│ │ # app-specific directories excluded
│ └── storage
│ └── sqlite
All packages are imported via their full path, i.e. import "github.com/myusername/myapp/pkg/http/rest" otherwise it causes problems all over the place and this was the one change I had to make going from $GOPATH to go mod.
go mod then handles all the dependencies it discovers properly, as far as I've seen.
I was quite surprised that I couldn't find this anywhere, but anyway: I would like to know the purpose of each folder in the .gradle folder, and how safe it is to delete them, especially in terms of portability.
I know that I need the caches folder, since it contains the downloaded dependencies.
The daemon folder seems to only contain logs?
workers is apparently empty for me.
wrapper seems irrelevant, since I don't use the gradle wrapper. Why does it even download all those wrappers?
No idea about native.
Directory layout is described in "The Directories and Files Gradle Uses" chapter of its user guide.
├── caches // <1>
│ ├── 4.8 // <2>
│ ├── 4.9 // <2>
│ ├── ⋮
│ ├── jars-3 // <3>
│ └── modules-2 // <3>
├── daemon // <4>
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d // <5>
│ └── my-setup.gradle
├── wrapper
│ └── dists // <6>
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties // <7>
<1> Global cache directory (for everything that's not project-specific)
<2> Version-specific caches (e.g. to support incremental builds)
<3> Shared caches (e.g. for artifacts of dependencies)
<4> Registry and logs of the Gradle Daemon
<5> Global initialization scripts
<6> Distributions downloaded by the Gradle Wrapper
<7> Global Gradle configuration properties
From version 4.10 onwards, Gradle automatically cleans its user home directory. The cleanup runs in the background when the Gradle daemon is stopped or shuts down. If using --no-daemon, it runs in the foreground after the build session with a visual progress indicator.
The following cleanup strategies are applied periodically (at most every 24 hours):
Version-specific caches in caches/<gradle-version>/ are checked for whether they are still in use. If not, directories for release versions are deleted after 30 days of inactivity, snapshot versions after 7 days of inactivity.
Shared caches in caches/ (e.g. jars-*) are checked for whether they are still in use. If there's no Gradle version that still uses them, they are deleted.
Files in shared caches used by the current Gradle version in caches/ (e.g. jars-3 or modules-2) are checked for when they were last accessed. Depending on whether the file can be recreated locally or would have to be downloaded from a remote repository again, it will be deleted after 7 or 30 days of not being accessed, respectively.
Gradle distributions in wrapper/dists/ are checked for whether they are still in use, i.e. whether there's a corresponding version-specific cache directory. Unused distributions are deleted.
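Since the cleanup runs when the daemon stops (see above), one way to nudge it is to stop the daemon manually:

gradle --stop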
native seems to contain platform-specific dependencies (like .so or .dll files) for libraries such as Jansi: Gradle needs them to provide rich console output (like colours in the output). The code for that feature is not documented, but you can take a look here. In particular, the library.jansi.path system property points to ~/.gradle/native/jansi/1.17.1/linux64 (on my machine; you can check that by printing System.getProperties() in a custom Gradle task).
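To see what ended up there on your machine (the version and platform directory names will differ), something like:

ls ~/.gradle/native/jansi/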
workers seems to be used as a working directory for the workers described in the Worker API.
Wrappers could also be downloaded by the IDE. Basically, if this directory is non-empty, that means you've actually used a wrapper at least once.