Maintaining NuGet library versions across projects with a shared output - visual-studio-2013

I keep two projects in these locations:
Project A: C:\GitRepository\xxx\yyy\zzz\ProjectA.sln
Project A packages: C:\GitRepository\xxx\yyy\zzz\packages
Project B: C:\GitRepository\xxx\yyy\sss\ProjectB.sln
Project B packages: C:\GitRepository\xxx\yyy\sss\packages
Both projects have a common output (C:\GitRepository\xxx\bin) - this is obligatory.
When one team creates a project and adds a NuGet package to it, the newest package version is downloaded by default. The project is then added to SolutionA. We end up with AutoMapper at, e.g., rev. 20 in SolutionA and at rev. 19 in SolutionB.
When Jenkins compiles the projects with MSBuild, building Project A restores the missing package rev. 20. Project B already has rev. 19 of the package but now needs the newer rev. 20. The error occurs when the correct version of the library cannot be read.
Solutions:
One solution is to remove the Specific Version attribute from the references, but we want to keep our shared libraries.
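In practice, "removing the specified version" means setting SpecificVersion to False on the reference in the consuming .csproj. A minimal sketch (the AutoMapper reference and the hint path are illustrative, not taken from the question):

```xml
<Reference Include="AutoMapper">
  <!-- Do not require the exact assembly version recorded at compile time -->
  <SpecificVersion>False</SpecificVersion>
  <HintPath>..\..\bin\AutoMapper.dll</HintPath>
</Reference>
```

With SpecificVersion set to False, MSBuild resolves whichever revision is present in the shared output instead of failing on a version mismatch.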
The team came up with another solution: create a single project, NuGetRepository, and add libraries to the output only through that project. Other projects must consume the libraries built from NuGetRepository.csproj as file references, not project references. We would like to keep one version in each project without conflicts.
Is solution 2 good?

Related

Service compiling successfully, but message structs not generating - gRPC/Go

I am using gRPC/protobufs as the protocol to communicate between my client and server, both written in Go. I'm able to run the command shown below to generate the cards.pb.go (server) and cards_grpc.pb.go (client) files without any problem. The server file is working perfectly, without any issues. The client file, however, does not seem to have access to the message items that I have defined within my cards.proto file. My services, as well as my client code, require the defined message structs in order to call the service methods, but I'm not sure what I'm missing.
Here is the command I'm running:
protoc -I="./protos" \
--go_out=plugins=grpc:./server \
--go-grpc_out=./client \
protos/*.proto
Here is my project file structure:
|-- client
|   |-- protos (generated protobufs for client)
|       |-- cards_grpc.pb.go (this compiled successfully, but structs representing my messages cannot be found)
|-- protos (this is where the proto files are defined)
|   |-- cards.proto
|-- server
|   |-- protos (generated protobufs for server)
|       |-- cards.pb.go (this is working perfectly, has compiled services and messages)
Note: I have defined option go_package = "./protos"; in my cards.proto file, which is why the generated files are output into */protos/*.pb.go locations
You are not generating any protobuf-related code for the client here, only the gRPC code. In order to generate the structs that you are looking for, use the following command:
protoc -I./protos \
--go_out=./server \
--go-grpc_out=./server \
--go_out=./client \
--go-grpc_out=./client \
protos/*.proto
The --go_out flag generates the Go code for protobuf and the --go-grpc_out flag generates the Go code for gRPC.
Another thing: --go_out=plugins=grpc is not supported anymore. You should use --go-grpc_out instead.
More recommendations
I highly recommend sharing the proto directory between the client and the server (if possible); this limits potential errors due to unsynchronised proto files.
So you would have something like:
|-- client
|-- protos
|   |-- cards.proto
|   |-- cards_grpc.pb.go
|   |-- cards.pb.go
|-- server
and then both access the files needed.
Second, if you are working with Go modules, I recommend that you use go_package as follows:
option go_package = "${YOUR_MODULE}/protos";
and then generate the code like this:
protoc -Iprotos \
--go_opt=module=${YOUR_MODULE} --go_out=. \
--go-grpc_opt=module=${YOUR_MODULE} --go-grpc_out=. \
protos/*.proto
Notice the . for --go_out and --go-grpc_out. This maps the root of your project to the module name, and the code is generated inside your protos directory by stripping the module name from the go_package option. You will then be able to access the generated code like so:
import (
pb "${YOUR_MODULE}/protos"
)
Clarification
Just to be clear about go_package, you need to understand one thing: the protobuf package and the go_package are not the same thing. The former defines the package that is only usable in .proto files; the latter defines the package used inside your Go files. An example:
For Protobuf package
file1.proto
//...
package protos;
message Test {}
//...
file2.proto
//...
//no package definition
message Test2 {
protos.Test a_test = 1;
}
//...
For go_package
go.mod
module my_module
file1.proto (at location: ${ROOT}/protos)
//...
option go_package = "my_module/protos";
message Test {}
//...
generation
protoc -I./protos \
--go_out=./server \
--go-grpc_out=./server \
--go_out=./client \
--go-grpc_out=./client \
protos/file1.proto
main.go
package main

import (
    pb "my_module/protos"
)

func main() {
    var test pb.Test
    _ = test
}

Import "google/api/annotations.proto" was not found or had errors. How do I add it as a dependency?

Following the docs on how to set up a gRPC gateway, I find myself stuck at step four of generating the grpc gateway.
Namely, things fall apart when the following line is added:
import "google/api/annotations.proto";
The documentation says "You will need to provide the required third party protobuf files to the protoc compiler" - but not actually how to do so.
How do I add google/api/annotations.proto as a dependency?
I solved it one way by adding the third-party googleapis files to the root of my project.
It feels wrong, but apparently this is encouraged.
I had the same issue and I resolved it with the following structure:
proto
├── google
│   └── api
│       ├── annotations.proto
│       └── http.proto
└── helloworld
    └── hello_world.proto
and ran the command:
protoc -I ./proto \
--go_out ./proto --go_opt paths=source_relative \
--go-grpc_out ./proto --go-grpc_opt paths=source_relative \
--grpc-gateway_out ./proto --grpc-gateway_opt paths=source_relative \
./proto/helloworld/hello_world.proto
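For reference, here is a minimal hello_world.proto that exercises the import, modelled on the standard grpc-gateway example (the service, message, and go_package names are illustrative):

```protobuf
syntax = "proto3";

package helloworld;

option go_package = "example/helloworld";

import "google/api/annotations.proto";

service Greeter {
  // The annotation maps this RPC to GET /v1/hello/{name} for the HTTP gateway
  rpc SayHello (HelloRequest) returns (HelloReply) {
    option (google.api.http) = {
      get: "/v1/hello/{name}"
    };
  }
}

message HelloRequest {
  string name = 1;
}

message HelloReply {
  string message = 1;
}
```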
I solved it by copying only annotations.proto and http.proto.
In the main proto:
import "Proto/google/api/annotations.proto";
and inside annotations.proto:
import "Proto/google/api/http.proto";
and my folders follow those import paths (the original screenshot is omitted here).
If you are using protoc to generate stubs, you need to ensure the required dependencies are available to the compiler at compile time. These can be found by manually cloning and copying the relevant files from the googleapis repository, and providing them to protoc when running. The files you will need are:
google/api/annotations.proto
google/api/field_behavior.proto
google/api/http.proto
google/api/httpbody.proto
(from the grpc-gateway documentation)
For example, run in the project root:
git submodule add https://github.com/googleapis/googleapis
to get the current version.
Sometimes this error occurs when running protoc.
Try passing the dependency's include path to the protoc command.
For this type of structure, use:
protoc -I . -I pb/google/api \
--go_out . --go_opt paths=source_relative \
--go-grpc_out . --go-grpc_opt paths=source_relative \
--grpc-gateway_out . --grpc-gateway_opt paths=source_relative \
./pb/*.proto

CMake generated project file to list external source

Please consider the following structure:
dev_root/
\__ common/
    \__ inc/
    \__ src/
    \__ CMakeLists.txt
\__ project1/
    \__ inc/
    \__ src/
    \__ CMakeLists.txt
\__ project2/
    \__ inc/
    \__ src/
    \__ CMakeLists.txt
Project1 and Project2 are separate projects.
They both use common code.
I want to selectively list some common source files in each project's Visual Studio project file, so that I can edit them in the IDE.
I don't want a standalone library for the common code.
For now I use relative paths, for example, in project1's cmake file:
set(_hdr
inc/proj1.h
../common/inc/common1.h
)
set(_src
src/proj1.cxx
../common/src/common1a.cxx
../common/src/common1b.cxx
)
source_group("common\\inc" FILES
../common/inc/common1.h
)
source_group("common\\src" FILES
../common/src/common1a.cxx
../common/src/common1b.cxx
)
source_group("inc" FILES inc/proj1.h)
source_group("src" FILES src/proj1.cxx)
add_executable( project1 ${_src} ${_hdr} )
This is not flexible - if I ever want to move folders around, every CMakeLists.txt must be reviewed and modified.
Is there a more elegant way to decouple the dependencies, or is there a better way to deal with common source files?
Any suggestions welcomed.
Thanks a lot.
You may set a variable to the path of the 'common' folder and use it to refer to files inside it. Then, whenever you need to move the 'common' folder around, you only need to edit this variable.
CMakeLists.txt:
set(COMMON_DIR ${CMAKE_CURRENT_SOURCE_DIR}/common)
project1/CMakeLists.txt:
set(_hdr
inc/proj1.h
${COMMON_DIR}/inc/common1.h
)
set(_src
src/proj1.cxx
${COMMON_DIR}/src/common1a.cxx
${COMMON_DIR}/src/common1b.cxx
)
...
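For the variable to be visible in project1 and project2, the top-level CMakeLists.txt has to set it before pulling in the subprojects. A minimal sketch (it assumes the subprojects are included with add_subdirectory, which the question does not state explicitly):

```cmake
# dev_root/CMakeLists.txt (sketch)
cmake_minimum_required(VERSION 3.10)
project(dev_root)

# Single place that records where the common sources live;
# subdirectories added below inherit this variable.
set(COMMON_DIR ${CMAKE_CURRENT_SOURCE_DIR}/common)

add_subdirectory(project1)
add_subdirectory(project2)
```

If the common folder later moves, only the set(COMMON_DIR ...) line needs to change.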

Not able to create vaadin project using maven in eclipse

Hi, I am currently building a Vaadin project using Maven but am getting the error below.
Unable to create project from archetype [com.vaadin:vaadin-archetype-application:LATEST -> ]
stack trace
org.apache.maven.archetype.exception.ArchetypeNotConfigured: Archetype com.vaadin:vaadin-archetype-application:7.1.13 is not configured
Property theme is missing.
at org.apache.maven.archetype.generator.DefaultFilesetArchetypeGenerator.generateArchetype(DefaultFilesetArchetypeGenerator.java:142)
at org.apache.maven.archetype.generator.DefaultArchetypeGenerator.processFileSetArchetype(DefaultArchetypeGenerator.java:213)
at org.apache.maven.archetype.generator.DefaultArchetypeGenerator.generateArchetype(DefaultArchetypeGenerator.java:128)
at org.apache.maven.archetype.generator.DefaultArchetypeGenerator.generateArchetype(DefaultArchetypeGenerator.java:286)
at org.apache.maven.archetype.DefaultArchetype.generateProjectFromArchetype(DefaultArchetype.java:69)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager.createArchetypeProjects0(ProjectConfigurationManager.java:761)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager$4.call(ProjectConfigurationManager.java:710)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager$4.call(ProjectConfigurationManager.java:1)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.executeBare(MavenExecutionContext.java:161)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:137)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:89)
at org.eclipse.m2e.core.internal.embedder.MavenImpl.execute(MavenImpl.java:1305)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager.createArchetypeProjects(ProjectConfigurationManager.java:708)
at org.eclipse.m2e.core.ui.internal.wizards.MavenProjectWizard$5.doCreateMavenProjects(MavenProjectWizard.java:244)
at org.eclipse.m2e.core.ui.internal.wizards.AbstactCreateMavenProjectJob$1.doCreateMavenProjects(AbstactCreateMavenProjectJob.java:46)
at org.eclipse.m2e.core.ui.internal.wizards.AbstractCreateMavenProjectsOperation.run(AbstractCreateMavenProjectsOperation.java:74)
at org.eclipse.m2e.core.ui.internal.wizards.AbstactCreateMavenProjectJob.runInWorkspace(AbstactCreateMavenProjectJob.java:50)
at org.eclipse.core.internal.resources.InternalWorkspaceJob.run(InternalWorkspaceJob.java:38)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
Please help with this; I am a complete novice.
Maybe try creating the project from the command line. To do this, follow this tutorial: https://vaadin.com/wiki/-/wiki/Main/Using+Vaadin+with+Maven
mvn archetype:generate \
-DarchetypeGroupId=com.vaadin \
-DarchetypeArtifactId=vaadin-archetype-clean \
-DarchetypeVersion=LATEST \
-DgroupId=your.company \
-DartifactId=project-name \
-Dversion=1.0 \
-Dpackaging=war
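Alternatively, the stack trace says the archetype's theme property is missing, so supplying it explicitly may let the generation succeed. A sketch (the property name is taken from the error message; the theme value and coordinates are placeholders):

```
mvn archetype:generate \
  -DarchetypeGroupId=com.vaadin \
  -DarchetypeArtifactId=vaadin-archetype-application \
  -DarchetypeVersion=7.1.13 \
  -DgroupId=your.company \
  -DartifactId=project-name \
  -Dversion=1.0 \
  -Dpackaging=war \
  -Dtheme=mytheme
```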

Different library paths for different build environments

I'm developing a UMDF-driver. The driver needs a different (build of a) library for 32 bit and 64 bit builds.
The TARGETLIBS property in my sources file looks like
TARGETLIBS=\
$(SDK_LIB_PATH)\strsafe.lib \
$(SDK_LIB_PATH)\kernel32.lib \
$(SDK_LIB_PATH)\ole32.lib \
$(SDK_LIB_PATH)\oleaut32.lib \
$(SDK_LIB_PATH)\uuid.lib \
...
..\otherlib\amd64\foo.lib \
but for an x86 build the path for foo.lib must be ..\otherlib\i386\foo.lib.
Obviously there is some mechanism for this in the DDK build system, since $(SDK_LIB_PATH) also points to different locations depending on the build architecture, but I'm unable to find documentation on this subject.
How do I set different library paths in one source file for different build types?
http://technet.microsoft.com/en-us/query/ff552910
Because of this convention, TARGETLIBS entries should specify library names in the following form:
<targetpath>\*\<library_name>
where targetpath is identical to the value assigned to TARGETPATH in the Sources file, and library_name is the full file name of the library to be linked to the executable file. The Build utility replaces the asterisk (*) with the target platform type.
That definitely works for my current problem, but if someone can offer a more general solution, I'm all ears...
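Applied to the sources file from the question, the per-architecture entry could then use the asterisk placeholder (a sketch; it assumes foo.lib is laid out under per-architecture subdirectories named as the Build utility expects, e.g. i386 and amd64, and the asker reports this convention solved their problem):

```
TARGETLIBS=\
    $(SDK_LIB_PATH)\strsafe.lib \
    ...
    ..\otherlib\*\foo.lib
```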
