How to install CBC for Pyomo locally on a Windows machine?

My goal is to connect the open-source CBC solver with Pyomo in Spyder. I am working on a Windows 10 machine and it is not an option for me to use the NEOS server due to company policy.
I have downloaded the binaries from Bintray (https://bintray.com/coin-or/download/Cbc#files) that include a cbc.exe file. However, when trying to run it, several errors come up stating that I am missing files (among others libbz2-1.dll and zlib1.dll). I do not know much about Linux or software development, but after a lot of time on Google I understand that these are used for unpacking data, among other things. I found all the files except zlib1.dll in a developer chat on the same subject, and zlib1.dll I found on another page. However, when running it I now get the error: "The application was unable to start correctly (0xc000007b)."
I have also tried downloading MSYS2 MinGW and followed the instructions from CBC. I don't know if I need this or if it is only meant for developers.
Can anyone tell me what to do? I suspect others besides me want to use CBC in Pyomo as an alternative to GLPK.

If you already have the .exe file, make sure it is in your current working folder (set it as the working directory in Spyder; simply opening your file is not enough) and call it using the SolverFactory function:
from pyomo.environ import SolverFactory  # the import the snippet needs

opt = SolverFactory("cbc.exe")
results = opt.solve(model)
It works for me.
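To make this concrete, here is a minimal, self-contained sketch. The tiny LP model is purely hypothetical (just for illustration), and it assumes cbc.exe sits in the working directory; otherwise pass the full path to SolverFactory:

from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                           SolverFactory, NonNegativeReals, maximize)

# A toy LP, purely for illustration
model = ConcreteModel()
model.x = Var(domain=NonNegativeReals)
model.y = Var(domain=NonNegativeReals)
model.profit = Objective(expr=3 * model.x + 2 * model.y, sense=maximize)
model.capacity = Constraint(expr=model.x + model.y <= 4)

opt = SolverFactory("cbc.exe")        # or a full path such as r"C:\cbc\bin\cbc.exe"
results = opt.solve(model, tee=True)  # tee=True echoes the CBC log to the console
print(results.solver.termination_condition)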

You will find some general information here, where I outlined some approaches.
While this was targeted at Clp, it also applies to Cbc.
It's a bit strange, as I also observed that some libs (zlib) are not statically linked, although that is certainly doable. But as mentioned in the thread, this should not be the case anymore (see the restriction about which files are fully statically linked), so your observation is odd (and you did not say which file you downloaded).
So I would try one of the following (in this order):
Try again with your source, but stick to the master versions (see the first link), as the maintainer only guarantees fully static builds for those!
Use the builds from AMPL
(tested and works for me; generally recommended in terms of quality/stability of builds)
Use the builds from coin-or/pulp, another modeling tool for Python
(tested and works for me)
Compile from source using mingw64
(Or use any build and provide the external DLLs of zlib and co. yourself -> hard to debug)
Of course I completely ignored other potential issues:
license stuff (what's part of those builds)
not sure whether a company can afford to use binaries not built by themselves, in regard to legal matters
version compatibility with Python
does every version of Cbc work?
Cbc version + configuration
modern version
compiled with multi-threading
...
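Whichever build you end up with, a quick way to verify that Pyomo can actually find and execute the binary is the following check (a minimal sketch, assuming Pyomo is installed and cbc is on the PATH or in the working directory):

from pyomo.environ import SolverFactory

# Prints True if the cbc executable was found and is runnable, False otherwise
print(SolverFactory("cbc").available(exception_flag=False))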

Related

How to add a kit to Qt Creator from the command line, or other programmatic manner?

SO!
Let's say I have a number of settings for a new Kit in QtCreator (a GCC 9.3.0 compiler built from source, since the distribution I have to use ships a very old one, along with the environment setup).
I have managed to set up an environment for compiling and running compiled binaries, and made a script to make it work (things like qmake -nocache -recursive / make / sudo make install, direct execution of g++, and other stuff).
One thing the script can't do at the moment is create a kit for QtCreator with the new compilers and environment set up as required, so after running the script, the user has to set everything up himself through the GUI, which is bad, because this can cause misconfiguration.
This thing I'm trying to create is going to be used by around 200 people in my company, so leaving a readme.txt with instructions just doesn't cut it for me - I don't want to run around fixing missing "{" and "}" in the Environment description of created Kits, and other stuff.
Are there ways to create Kits for QtCreator automatically from the command line? Maybe there are some files to edit?
I've looked into this a few years back (I wanted to do something similar for registering Buildroot toolchains automatically in QtCreator), and I was unable to find an off-the-shelf solution. So I think there are two ways to implement this:
a) Implementing a command-line utility to manipulate the ~/.config/QtProject/qtcreator/{toolchains,profiles}.xml files. Maybe by (re)using the existing C++ implementation within QtCreator, or just re-implementing it, e.g. in Python. Back then I didn't start to work on this as there was no real business need.
b) Switching to qbs, as qbs has support for setting up toolchains from the command line (see https://doc.qt.io/qbs/cli-setup-toolchains.html).
If you decide to go with solution a), please let me know and maybe we can partner up to implement it.
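If you explore option a), a reasonable first step is just reading what QtCreator has already registered. The sketch below only inspects the file; note that the schema is undocumented and version-dependent, so the path and element names are assumptions to verify against your installation:

import xml.etree.ElementTree as ET
from pathlib import Path

# Default location on Linux; adjust for your platform/installation
toolchains = Path.home() / ".config/QtProject/qtcreator/toolchains.xml"

# The settings files use a <qtcreator> root whose <data> entries each
# carry a <variable> name such as "ToolChain.0"
tree = ET.parse(toolchains)
for data in tree.getroot().findall("data"):
    variable = data.find("variable")
    if variable is not None:
        print(variable.text)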
Check out the command line sdktool bundled with QtCreator:
The SDK tool can be used to set up Qt versions, tool chains, devices
and kits in Qt Creator.
There still is a lot of knowledge about Qt Creator internals required
to use this tool!
I haven't tried it yet, but I did find the executable under the Tools/QtCreator/libexec/qtcreator subdirectory of the Qt Creator installation directory. ./sdktool --help works for me under Linux.
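As an illustration, registering a toolchain might look roughly like this. The subcommand and flag names vary between Qt Creator versions, so treat this as a hypothetical invocation and verify against ./sdktool --help and ./sdktool addTC --help for your version:

./sdktool addTC \
    --id "ProjectExplorer.ToolChain.Gcc:company.gcc930" \
    --name "GCC 9.3.0 (built from source)" \
    --path /opt/gcc-9.3.0/bin/g++ \
    --abi x86-linux-generic-elf-64bit \
    --language Cxx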

Opensplice failed to build dcpsisocpp2

I downloaded the latest source code of Opensplice DDS from https://github.com/ADLINK-IST/opensplice and tried to build it by following its instructions (source setenv, source ./configure, then make, etc.) in my 64-bit Cygwin.
The build (the make command) appeared to complete, but a number of modules such as dcpsisocpp2, durability, and spliced didn't get built (I can't find dcpsisocpp2.dll, etc.).
I wonder if anyone who is familiar with Opensplice's makefile system can direct me to solve the problem.
You should first identify whether you are going to use the community or the enterprise version.
It seems the community version doesn't have the spliced and durability services. Also, dcpsisocpp2 uses C++03, which is a very old C++ standard, so when you write your application in C++11 or C++14 you might get warnings or errors and spend lots of time fixing compile problems.
Try to use dcpssacpp instead, which follows the C++11 standard.

VST plugin doesn't get recognized on OSX

I'm just trying to get my foot in the door of the OSX world after recently getting a Mac.
Over the past months I haven't successfully built a working VST 2.4 yet, and I simply don't get why: the projects in the VST examples work (somewhat) out of the box, but my own projects fail to work.
I've mirrored every build setting exactly (including Info.plist and PkgInfo), double-checked that the contents of the vst.app bundles are identical, that mine correctly gets built as a VST with the correct extensions etc., and the code is virtually the same; however, my VST doesn't get recognized in any of the hosts I tried.
Even the command line for the build is identical.
I've tested my VST with the included minihost VST tester, and it passes and works - but it still won't be recognized. I even checked the exported symbols with nm and they look correct (i.e. createEffectInstance is correctly exported).
What gives? There must be some hidden build setting somewhere that I haven't discovered that disqualifies my VST.
Probably the most obvious but overlooked setting: are you building as 32- or 64-bit? You need to make sure that the bitness matches your host's, otherwise the plugin won't be loaded (which, by the way, might explain why the plugin could load in your self-built minihost but not in another sequencer). To ensure compatibility with most hosts, I'd go for a 32-bit build.
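A quick way to check which architectures your built plugin actually contains is lipo (the bundle path here is hypothetical; substitute your own):

lipo -info MyPlugin.vst/Contents/MacOS/MyPlugin

If the output lists only x86_64 but your host is 32-bit (or the other way around), that mismatch is your problem.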
Also, here's a tutorial I wrote on the subject awhile back. However, you claim that you are doing everything correct with the Info.plist and whatnot, but perhaps you missed a small step:
http://teragonaudio.com/article/Making-a-VST-plugin-from-scratch-with-Xcode.html
Another potentially useful tool is MrsWatson (disclaimer: I'm the author of that program). It's a command-line VST host which can be used to provide diagnostic information about VSTs, and it is also designed for plugin testing and debugging. Because of the 32/64-bit difficulties with plugins, on Mac OSX the program ships separate 32- and 64-bit binaries rather than a universal binary.
You should try running the following command on your plugin:
mrswatson --verbose --plugin /path/to/wherever/you/put/the/plugin.vst --display-info
If you see a list of parameters and other info, then it should be kosher and able to be loaded in most sequencers. Hope this gets you on the right track!
Try using your debugger to watch which Dispatcher() calls your host makes, and check whether there are any differences between your plugin and the included example projects. In my experience a host will usually abort loading a plugin immediately after a Dispatcher() call raises an exception or returns a result the host doesn't like for whatever reason.
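For example, with lldb you could launch the host and break on the dispatcher of the VST 2.4 SDK's base class while the plugin is being scanned (the host path is hypothetical, and the symbol name assumes the stock SDK sources):

lldb /Applications/YourHost.app/Contents/MacOS/YourHost
(lldb) breakpoint set --name AudioEffect::dispatcher
(lldb) run

Each time the breakpoint hits, the opcode argument tells you which call the host made last before it gave up on the plugin.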

missing zlib.dll

I am building a win32 executable. The compiler is the latest version of MinGW. The library dependencies are GLUT and libpng.
I first tested on a Windows 7 machine and had to obtain libpng3.dll and freeglut32.dll. However, on XP, I had to acquire zlib1.dll in addition.
The XP machine was a VM with a fresh install, so I suspect a fresh Win7 machine may also be lacking zlib1.dll.
My question is: how do I go about finding out which DLLs I need to distribute? How do I know, a priori, which dynamic libraries are needed for my program to run on a particular system?
I suppose this is what installer programs are for... I'm guessing that what the installer does is look through the system to find out which dependencies are unsatisfied, and then provide them. That way, if I were to distribute my program, I could check whether the user's machine already has zlib1.dll and skip installing it if it's already found in the system directory.
However, I never found a document that told me specifically "libpng requires zlib", so until the point I tested the executable on a machine lacking zlib, I was unaware of this dependency. How can I create my dependency list without having a fresh install of each version of every operating system to test on?
One idea I have is to decompile the executable, or examine the linking process through some other method, to find all the libraries that are being linked at runtime. The problem then becomes figuring out which of these are supposed to already be there, and which of them I would be expected to provide in the distribution.
edit: Okay, I looked, and the libpng installation I downloaded did provide zlib1.dll inside its bin directory. So not including it is pretty much my fault. In any case, Daniel's answer is definitive.
Dependency Walker shows all the dependencies of your program.
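If you built with MinGW (as in the question), the objdump that ships with it lists the same import information; the executable name is hypothetical:

objdump -p myprogram.exe | findstr /C:"DLL Name"

Each "DLL Name:" line is a DLL the executable links against at load time (dependencies loaded dynamically via LoadLibrary will not show up here, though).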
The correct answer to this question, in my view, is to start at the source rather than to reverse engineer the solution with Dependency Walker, awesome and useful tool though it undoubtedly is.
The problem with Dependency Walker is that it only tells you what one particular run of the program requires on the OS on which you run it. If you have any dynamic-loading dependencies in your app, you would only pick those up if you made sure you profiled the app with Dependency Walker and forced it through those dynamic loads.
My preferred approach to this problem is to start with your own source code and analyse and understand what it depends upon. It's often easy enough to do so because you know it well.
You need to understand the deployment requirements of your compiler. You usually have the option of linking either statically or dynamically to the C++ runtime. Obviously, a dynamic link results in a deployment requirement.
You will also likely link to 3rd party code. One example would be Windows components. These typically don't need deployment; you can take them as already being in place. Sometimes that's not true, e.g. GDI+ on Windows 2000.
Sometimes you will link statically to 3rd party code (again easy), but if you link dynamically then that implies a deployment requirement.
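As a concrete illustration for the MinGW toolchain used in the question: the GCC runtime libraries can be removed from the deployment list by linking them statically. The flags are real GCC options, but the import-library names (-lfreeglut32, -lpng, -lz) are assumptions that depend on your MinGW distribution:

# Link the GCC runtimes statically so their DLLs need not be shipped
g++ -o myprogram.exe main.cpp -static-libgcc -static-libstdc++ -lfreeglut32 -lpng -lz -lopengl32

Note that libpng still pulls in zlib one way or the other, statically or dynamically, which is exactly the hidden dependency from the question.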

How do you build the Windows D3D9 refrast from source?

Microsoft distributes source code for reference implementations of their different Direct3D APIs to hardware vendors, driver developers, etc. This code builds using the ever-cryptic WDK (formerly DDK) build system, and virtually never works out of the box. Though widely used, this code is semi-private, so there is never any basic helpful information available on the wider web. It is commonly used enough (and a well-known pain in the ass in this community), and the basic build information is non-sensitive enough, that it should be discoverable on Google.
The build readme suggests using the WDK and building with the command build -cz -daytona. This, confusingly, spits out a bunch of output yet builds nothing.
Getting past this, on Vista with WDK 6001.18002, the latest d3dref9 source distribution fails in the link subproject with "failed to produce any output - warning treated as error."
The XP d3dref9.dll can also be confusing to build, frequently failing to find D3D headers and types.
There are generally two major issues quite common in building the refrast source drops as they come direct from Microsoft.
First, the build -cz -daytona command is either a typo or relies on undocumented additional external configuration. Building in this mode parses all the source, but never specifies which platform(s) to build. Since all platform dirs (daytona and win9x), where the actual outputs are specified, are "optional," nothing is ever actually built. The solution to this is to instead use the correctly-specified command build -cz daytona (no '-' on daytona). This should parse the sources and then actually build everything.
Past this, there are usually also problems with the out-of-the-box build setup.
The new WDKs (e.g. on Vista) generally fail in the final linking step with a spurious linker error. This is easily fixed by adding:
LIBRARIAN_FLAGS = $(LIBRARIAN_FLAGS) /IGNORE:4001
to the link/sources.inc build file. After this, build -cz daytona in the root of the source drop should build and link everything out-of-the-box.
On XP, it is also common to have issues if using older DDKs (pre-Windows Server 2003, i.e. "XP"-labeled DDKs). In particular, the refrast project relies on core D3D9 headers existing externally, and these are not included in the XP DDK. Simply using the latest WDKs (rebranded from "DDK" post-XP) solves this. Contrary to the naming, all newer WDKs are generally supersets of older releases, and so include build environments for platforms back through XP.
There may also be issues with some XP refrast source releases including code which triggers more pedantic compiler errors in the newer WDK compiler releases. These, however, can usually be easily fixed by iterative compilation and source tweaking in response to any simple safety/correctness errors raised by the compiler.
