I would like to use a specific version of g++ installed at /opt/blabla/bin/g++.
How do I force premake to initialize the CXX variable in the generated makefile, such that it points to that specific location?
I do realize that once the makefile is generated, I can run 'make CXX=...', but I would like to have CXX set inside the auto-generated makefile.
Using premake5, targeting gmake.
Thanks in advance
=============
Update:
By poking at examples and browsing the code, I figured out I can do it by adding this code to premake5.lua:
local GCC_BIN_PATH = "/opt/blabla/bin"
-- start: setting gcc version
-- todo: consider moving this instrumentation into a side lua script
local gcc = premake.tools.gcc
gcc.tools = {
    cc  = GCC_BIN_PATH .. "/gcc",
    cxx = GCC_BIN_PATH .. "/g++",
    ar  = GCC_BIN_PATH .. "/ar"
}

function gcc.gettoolname(cfg, tool)
    return gcc.tools[tool]
end
-- finish: setting gcc version
Is there a better way to achieve the same thing? In particular, does it make sense to redefine the gettoolname function?
Same answer as above but more generic. This file can be named "add_new_gcc_toolset.lua":
local function add_new_gcc_toolset(name, prefix)
    local gcc = premake.tools.gcc
    local new_toolset = {}
    new_toolset.getcflags         = gcc.getcflags
    new_toolset.getcxxflags       = gcc.getcxxflags
    new_toolset.getforceincludes  = gcc.getforceincludes
    new_toolset.getldflags        = gcc.getldflags
    new_toolset.getcppflags       = gcc.getcppflags
    new_toolset.getdefines        = gcc.getdefines
    new_toolset.getincludedirs    = gcc.getincludedirs
    new_toolset.getLibraryDirectories = gcc.getLibraryDirectories
    new_toolset.getlinks          = gcc.getlinks
    new_toolset.getmakesettings   = gcc.getmakesettings
    new_toolset.toolset_prefix    = prefix

    function new_toolset.gettoolname(cfg, tool)
        local toolname
        if tool == "cc" then
            toolname = new_toolset.toolset_prefix .. "gcc"
        elseif tool == "cxx" then
            toolname = new_toolset.toolset_prefix .. "g++"
        elseif tool == "ar" then
            toolname = new_toolset.toolset_prefix .. "ar"
        end
        return toolname
    end

    premake.tools[name] = new_toolset
end

return add_new_gcc_toolset
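For reference, a minimal usage sketch of the file above; the toolset name "opt_gcc" and the workspace/project names here are made up for illustration, and dofile picks up the function the script returns:

-- premake5.lua, assuming add_new_gcc_toolset.lua sits next to it
local add_new_gcc_toolset = dofile("add_new_gcc_toolset.lua")

-- register a toolset whose tools live under /opt/blabla/bin/
add_new_gcc_toolset("opt_gcc", "/opt/blabla/bin/")

workspace "MyWorkspace"
    configurations { "Debug", "Release" }

project "MyProject"
    kind "ConsoleApp"
    language "C++"
    toolset "opt_gcc"
    files { "src/**.cpp" }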
I think the official way is to create your own toolset.
The example below creates a toolset with the "arm_gcc" name:
premake.tools.arm_gcc = {}

local arm_gcc = premake.tools.arm_gcc
local gcc = premake.tools.gcc

arm_gcc.getcflags         = gcc.getcflags
arm_gcc.getcxxflags       = gcc.getcxxflags
arm_gcc.getforceincludes  = gcc.getforceincludes
arm_gcc.getldflags        = gcc.getldflags
arm_gcc.getcppflags       = gcc.getcppflags
arm_gcc.getdefines        = gcc.getdefines
arm_gcc.getincludedirs    = gcc.getincludedirs
arm_gcc.getLibraryDirectories = gcc.getLibraryDirectories
arm_gcc.getlinks          = gcc.getlinks
arm_gcc.getmakesettings   = gcc.getmakesettings

function arm_gcc.gettoolname(cfg, tool)
    local prefix = path.getabsolute("../../arm_env")
    prefix = prefix .. "/arm_toolchain/bin/arm-linux-gnueabihf-"
    local name
    if tool == "cc" then
        name = prefix .. "gcc"
    elseif tool == "cxx" then
        name = prefix .. "g++"
    elseif tool == "ar" then
        name = prefix .. "ar"
    else
        name = nil
    end
    return name
end
Then you can use:
toolset "arm_gcc"
inside your projects, filters, etc.
This method has the advantage that you aren't overwriting the regular gcc toolset; both can coexist if necessary.
In my case I actually find it cleaner to add each compiler in its own Lua file and then include it from the main (premake5.lua) script.
As a workaround, I found this method:
makesettings [[
CC = gcc
]]
https://github.com/premake/premake-core/wiki/makesettings
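Adapted to the question's path, this would presumably look like the following sketch (assuming makesettings passes the block into the generated makefile verbatim, as the wiki page describes):

makesettings [[
CC = /opt/blabla/bin/gcc
CXX = /opt/blabla/bin/g++
]]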
Related
I am using the graphviz package as part of a service, and to use it I begin the Bazel WORKSPACE file like this:
new_local_repository(
    name = "graphviz",
    path = "/usr/local/Cellar/graphviz/2.49.1",
    build_file_content = """
package(default_visibility = ["//visibility:public"])

cc_library(
    name = "headers",
    srcs = glob(["**/*.dylib"]),
    hdrs = glob(["**/*.h"]),
)
""",
)
...
The problem with this is that it depends on graphviz being downloaded, pre-installed, and present at the path /usr/local/Cellar/graphviz/2.49.1. Is there a way to make it part of the Bazel build process, such that if it's not present it will be fetched and put in the right place?
You can use http_archive to download one of graphviz's release archives:
https://docs.bazel.build/versions/main/repo/http.html#http_archive
From https://graphviz.org/download/source/ the 2.49.1 release is available at https://gitlab.com/api/v4/projects/4207231/packages/generic/graphviz-releases/2.49.1/graphviz-2.49.1.tar.gz
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "graphviz",
url = "https://gitlab.com/api/v4/projects/4207231/packages/generic/graphviz-releases/2.49.1/graphviz-2.49.1.tar.gz",
strip_prefix = "graphviz-2.49.1",
sha256 = "ba1aa7a209025cb3fc5aca1f2c0114e18ea3ad29c481d75e4d445ad44e0fb0f7",
build_file_content = """
package(default_visibility = ["//visibility:public"])
cc_library(
name = "headers",
srcs = glob(["**/*.dylib"]),
hdrs = glob(["**/*.h"])
)
""",
)
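With that in place, a target can depend on the downloaded library; a sketch, where the cc_binary name and source file are made up for illustration:

cc_binary(
    name = "renderer",
    srcs = ["renderer.c"],
    deps = ["@graphviz//:headers"],
)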
To answer the "if it's not present" part of the question: I'm not aware of a straightforward way to automatically switch between something that's locally installed and downloading something. http_archive will always download the archive, and new_local_repository will always use something local.
There's the --override_repository flag, which replaces a repository with a local one; e.g., --override_repository=graphviz=/usr/local/Cellar/graphviz/2.49.1 would effectively replace the http_archive with a local_repository pointing to that path. However, Bazel would then expect a WORKSPACE file and a BUILD file to already exist at that location (i.e., there's no way to specify build_file_content).
You could specify both repository rules in the WORKSPACE file, and then use some indirection, a Starlark flag, and a select() to switch between the repositories using a command-line flag. It's a little involved though, and also not automatic. Something like this:
WORKSPACE:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "graphviz-download",
    ...,
)

new_local_repository(
    name = "graphviz-installed",
    ...,
)

http_archive(
    name = "bazel_skylib",
    urls = [
        "https://github.com/bazelbuild/bazel-skylib/releases/download/1.1.1/bazel-skylib-1.1.1.tar.gz",
        "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/releases/download/1.1.1/bazel-skylib-1.1.1.tar.gz",
    ],
    sha256 = "c6966ec828da198c5d9adbaa94c05e3a1c7f21bd012a0b29ba8ddbccb2c93b0d",
)

load("@bazel_skylib//:workspace.bzl", "bazel_skylib_workspace")

bazel_skylib_workspace()
BUILD (e.g. in //third_party/graphviz):
load("#bazel_skylib//rules:common_settings.bzl", "bool_flag")
bool_flag(
name = "use-installed-graphviz",
build_setting_default = False,
)
config_setting(
name = "installed",
flag_values = {
":use-installed-graphviz": "True",
}
)
alias(
name = "headers",
actual = select({
":installed": "#graphviz-installed//:headers",
"//conditions:default": "#graphviz-download//:headers",
})
)
Then your code depends on //third_party/graphviz:headers. By default the alias points to the downloaded version, and the flag --//third_party/graphviz:use-installed-graphviz switches it to the installed version:
$ bazel cquery --output build //third_party/graphviz:headers
alias(
    name = "headers",
    actual = "@graphviz-download//:headers",
)

$ bazel cquery --output build //third_party/graphviz:headers --//third_party/graphviz:use-installed-graphviz
alias(
    name = "headers",
    actual = "@graphviz-installed//:headers",
)
Another option is to write (or find) a custom repository rule that combines the functionality of http_archive and local_repository, but that would probably be a fair bit of work.
Generally I think most people just use an http_archive and download the dependencies, and if there are specific needs for being offline or caching, there's --distdir for using already-downloaded artifacts for remote repository rules: https://docs.bazel.build/versions/main/guide.html#distribution-files-directories
Edit: An example of using rules_foreign_cc with graphviz:
WORKSPACE:
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "rules_foreign_cc",
sha256 = "69023642d5781c68911beda769f91fcbc8ca48711db935a75da7f6536b65047f",
strip_prefix = "rules_foreign_cc-0.6.0",
url = "https://github.com/bazelbuild/rules_foreign_cc/archive/0.6.0.tar.gz",
)
load("#rules_foreign_cc//foreign_cc:repositories.bzl", "rules_foreign_cc_dependencies")
# This sets up some common toolchains for building targets. For more details, please see
# https://bazelbuild.github.io/rules_foreign_cc/0.6.0/flatten.html#rules_foreign_cc_dependencies
rules_foreign_cc_dependencies()
http_archive(
name = "graphviz",
url = "https://gitlab.com/api/v4/projects/4207231/packages/generic/graphviz-releases/2.49.1/graphviz-2.49.1.tar.gz",
strip_prefix = "graphviz-2.49.1",
sha256 = "ba1aa7a209025cb3fc5aca1f2c0114e18ea3ad29c481d75e4d445ad44e0fb0f7",
build_file_content = """\
filegroup(
name = "all_srcs",
srcs = glob(["**"]),
visibility = ["//visibility:public"],
)
""",
)
BUILD:
load("#rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")
# see https://bazelbuild.github.io/rules_foreign_cc/0.6.0/configure_make.html
configure_make(
name = "graphviz",
lib_source = "#graphviz//:all_srcs",
out_shared_libs = ["libcgraph.so"], # or other graphviz libs
)
cc_binary(
name = "foo",
srcs = ["foo.c"],
deps = [":graphviz"],
)
foo.c:
#include "graphviz/cgraph.h"
int main() {
Agraph_t *g;
g = agopen("G", Agdirected, NULL);
agclose(g);
return 0;
}
Usage:
$ bazel build foo
INFO: Analyzed target //:foo (0 packages loaded, 2 targets configured).
INFO: Found 1 target...
Target //:foo up-to-date:
bazel-bin/foo
INFO: Elapsed time: 0.229s, Critical Path: 0.06s
INFO: 7 processes: 5 internal, 2 linux-sandbox.
INFO: Build completed successfully, 7 total actions
I've used the Bazel rule below to build static libraries with Bazel:
def _cc_static_library_impl(ctx):
    cc_deps = [dep[CcInfo] for dep in ctx.attr.deps]
    libraries = []
    for cc_dep in cc_deps:
        for link_input in cc_dep.linking_context.linker_inputs.to_list():
            for library in link_input.libraries:
                libraries += library.pic_objects
    args = ["r", ctx.outputs.out.path] + [f.path for f in libraries]
    ctx.actions.run(
        inputs = libraries,
        outputs = [ctx.outputs.out],
        executable = "/usr/bin/ar",
        arguments = args,
    )
    return [DefaultInfo()]

cc_static_library = rule(
    implementation = _cc_static_library_impl,
    attrs = {
        "deps": attr.label_list(providers = [CcInfo]),
    },
    outputs = {"out": "lib%{name}.a"},
)
How can I extract the command to use from the current toolchain instead of using the hardcoded /usr/bin/ar? I've based the rule on what I've found on the internet and I have very limited knowledge about this. This example seems to do something related:
https://github.com/bazelbuild/rules_cc/blob/main/examples/my_c_archive/my_c_archive.bzl
This is the relevant part of my_c_archive:
archiver_path = cc_common.get_tool_for_action(
    feature_configuration = feature_configuration,
    action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
)
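For context, feature_configuration and cc_toolchain in that snippet come from the toolchain lookup earlier in my_c_archive; roughly this sketch, following the rules_cc example (the rule also needs a _cc_toolchain attribute for find_cpp_toolchain to resolve):

load("@bazel_tools//tools/build_defs/cc:action_names.bzl", "CPP_LINK_STATIC_LIBRARY_ACTION_NAME")
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")

def _impl(ctx):
    # Resolve the C++ toolchain for the current configuration.
    cc_toolchain = find_cpp_toolchain(ctx)

    # Work out which toolchain features are enabled; this drives
    # tool selection, flags, and environment variables.
    feature_configuration = cc_common.configure_features(
        ctx = ctx,
        cc_toolchain = cc_toolchain,
        requested_features = ctx.features,
        unsupported_features = ctx.disabled_features,
    )

# In the rule definition:
#     "_cc_toolchain": attr.label(default = Label("@bazel_tools//tools/cpp:current_cc_toolchain")),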
That gets you the path, and then you need to add cc_toolchain.all_files to your ctx.actions.run call, so it ends up looking like this:
ctx.actions.run(
    inputs = depset(
        direct = libraries,
        transitive = [
            cc_toolchain.all_files,
        ],
    ),
    outputs = [ctx.outputs.out],
    executable = archiver_path,
    arguments = args,
)
However, you'll also notice that my_c_archive builds up a command line and environment variables to call the archiver with. A simple toolchain won't have anything to pass in either of those, but a more complex one may not work correctly without adding them (for example, passing -m32 or setting PATH).
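A rough sketch of that part, again following the my_c_archive example (names reused from the snippets above; this is illustrative, not a drop-in implementation):

# Build the "link variables" that describe this archiving action.
archiver_variables = cc_common.create_link_variables(
    feature_configuration = feature_configuration,
    cc_toolchain = cc_toolchain,
    output_file = ctx.outputs.out.path,
    is_using_linker = False,
)

# Expand the toolchain's configured flags for static archiving.
command_line = cc_common.get_memory_inefficient_command_line(
    feature_configuration = feature_configuration,
    action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
    variables = archiver_variables,
)

# Toolchain-required environment variables (e.g. PATH adjustments).
env = cc_common.get_environment_variables(
    feature_configuration = feature_configuration,
    action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
    variables = archiver_variables,
)

# Then pass these to the action instead of hardcoding "r" and the paths:
#     arguments = command_line + [f.path for f in libraries], env = env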
Part of the Starlark implementation of cc_import, in _create_archive_action in cc_import.bzl, is a good place to start for handling all the intricacies. It creates an action to produce a static library from a set of object files, with the flexibility to work with many toolchains.
I'm trying to write a custom rule for gqlgen. The idea is to run it to generate Go code from a GraphQL schema.
My intended usage is:
gqlgen(
    name = "gql-gen-foo",
    schemas = ["schemas/schema.graphql"],
    visibility = ["//visibility:public"],
)
"name" is the name of the rule, on which I want other rules to depend; "schemas" is the set of input files.
So far I have:
load(
    "@io_bazel_rules_go//go:def.bzl",
    _go_context = "go_context",
    _go_rule = "go_rule",
)

def _gqlgen_impl(ctx):
    go = _go_context(ctx)
    args = ["run github.com/99designs/gqlgen --config"] + [ctx.attr.config]
    ctx.actions.run(
        inputs = ctx.attr.schemas,
        outputs = [ctx.actions.declare_file(ctx.attr.name)],
        arguments = args,
        progress_message = "Generating GraphQL models and runtime from %s" % ctx.attr.config,
        executable = go.go,
    )

_gqlgen = _go_rule(
    implementation = _gqlgen_impl,
    attrs = {
        "config": attr.string(
            default = "gqlgen.yml",
            doc = "The gqlgen filename",
        ),
        "schemas": attr.label_list(
            allow_files = [".graphql"],
            doc = "The schema file location",
        ),
    },
    executable = True,
)

def gqlgen(**kwargs):
    tags = kwargs.get("tags", [])
    if "manual" not in tags:
        tags.append("manual")
    kwargs["tags"] = tags
    _gqlgen(**kwargs)
My immediate issue is that Bazel complains that the schemas are not Files:
expected type 'File' for 'inputs' element but got type 'Target' instead
What's the right approach to specify the input files?
Is this the right approach to generate a rule that executes a command?
Finally, is it okay to have the output file not exist in the filesystem, but rather be a label on which other rules can depend?
Instead of:
ctx.actions.run(
    inputs = ctx.attr.schemas,
Use:
ctx.actions.run(
    inputs = ctx.files.schemas,
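The difference: ctx.attr.schemas gives you the label_list attribute as a list of Target objects, while ctx.files.schemas flattens those targets into the list of File objects that ctx.actions.run expects as inputs.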
Is this the right approach to generate a rule that executes a command?
This looks right, as long as gqlgen creates the file with the correct output name (outputs = [ctx.actions.declare_file(ctx.attr.name)]). For example:
generated_go_file = ctx.actions.declare_file(ctx.attr.name + ".go")

# ..

ctx.actions.run(
    outputs = [generated_go_file],
    args = ["run", "...", "--output", generated_go_file.short_path],
    # ..
)
Finally, is it okay to have the output file not exist in the filesystem, but rather be a label on which other rules can depend?
The output file needs to be created, and as long as it's returned at the end of the rule implementation in a DefaultInfo provider, other rules will be able to depend on the file's label (e.g. //my/package:foo-gqlgen.go).
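A minimal sketch of that return, reusing the generated_go_file example from above:

def _gqlgen_impl(ctx):
    generated_go_file = ctx.actions.declare_file(ctx.attr.name + ".go")
    # ... ctx.actions.run(...) as above ...
    return [DefaultInfo(files = depset([generated_go_file]))]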
What is the best way to build C++ code that uses the dlib library using Bazel? I.e., what would the BUILD rules look like?
I tried following the answer for OpenCV as follows, but had no luck:
cc_library(
    name = "dlib",
    srcs = glob(["build/dlib/*.so*"]),
    hdrs = glob(["dlib/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)
I think I figured it out. Assuming dlib was unzipped to /opt/dlib-19.2 and built in /opt/dlib-19.2/build.
In your WORKSPACE file:
new_local_repository(
    name = "dlib",
    path = "/opt/dlib-19.2",
    build_file = "dlib.BUILD",
)
In dlib.BUILD:
cc_library(
    name = "dlib",
    srcs = glob(["build/dlib/*.so*"]),
    hdrs = glob(["dlib/**/*.h"]),
    includes = ["."],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)
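Then a target in a regular BUILD file can link against it; a sketch, where the binary and source names are made up for illustration:

cc_binary(
    name = "face_detector",
    srcs = ["face_detector.cc"],
    deps = ["@dlib//:dlib"],
)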
I want the function to return the "root" and the paths to all the subsystems below it. I have no clue yet how to do it. This is my attempt; I haven't run it yet because I still need to find the functions for building lists, appending, getting subsystem paths, and so on. I just wanted to ask whether this is the right way to do it, so that I can proceed with my search for these functions, or whether there are existing functions that do it, or parts of it.
function [dest, paths] = SaveRootAndBelow(path)
    entitytosave = gcs;
    if strcmp(bdroot(entitytosave), entitytosave)
        % I'm the main model
        dest = save_system(entitytosave, path);
        paths = getPaths(entitytosave);
    else
        % I'm a subsystem
        newbd = new_system;
        open_system(newbd);
        % copy the subsystem
        Simulink.SubSystem.copyContentsToBlockDiagram(entitytosave, newbd);
        dest = save_system(newbd, path);
        paths = getPaths(newbd);
        % close the new model
        close_system(newbd, 0);
    end
end
function paths = getPaths(root)
    root = getfullname(root);  % accept either a name or a numeric handle
    paths = {};
    % direct children only; deeper levels are handled by the recursion
    subsystems = find_system(root, 'SearchDepth', 1, 'BlockType', 'SubSystem');
    subsystems = setdiff(subsystems, {root});  % find_system includes root itself if it matches
    for k = 1:numel(subsystems)
        paths{end+1} = subsystems{k};  %#ok<AGROW>
        paths = [paths, getPaths(subsystems{k})];
    end
end