This is the first time I have heard of cross-compilation, and I have tried to read extensively about it and about the procedure for the task described below.
I want to run ffmpeg on a Xilinx ZedBoard that runs Debian.
Could anyone outline the basic steps for cross-compilation? (I'm using a Linux machine running Ubuntu 16.04.)
NOTE: For reference, I have the following questions:
1) Do I need to install Xilinx tools (e.g. Vivado)?
2) Is arm-xilinx-linux-gnueabi-gcc required? If yes, how do I install it?
3) Do I compile directly on the ZedBoard, or on my Ubuntu machine first?
4) What is OpenCV, and what is its relationship with ffmpeg? Do I need OpenCV?
Thanks in advance
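For reference, cross-compiling ffmpeg for an ARM board from an Ubuntu host generally looks like the sketch below. This is only a sketch, not a tested recipe: it assumes the generic Linaro hard-float toolchain from the Ubuntu repositories (the Xilinx arm-xilinx-linux-gnueabi- toolchain from the Xilinx SDK works the same way, just with a different prefix), and the install prefix `/opt/ffmpeg-arm` is an arbitrary example.

```shell
# Install a generic ARM hard-float cross-toolchain on the Ubuntu host.
# (Alternative: the arm-xilinx-linux-gnueabi- toolchain from the Xilinx SDK.)
sudo apt-get install gcc-arm-linux-gnueabihf

# In the ffmpeg source tree: configure for cross-compilation.
# The build runs entirely on the x86 host; only the output targets ARM.
./configure \
    --enable-cross-compile \
    --cross-prefix=arm-linux-gnueabihf- \
    --arch=arm --target-os=linux \
    --prefix=/opt/ffmpeg-arm
make -j4
make install    # then copy /opt/ffmpeg-arm over to the ZedBoard's Debian rootfs
```

This also answers question 3 implicitly: you compile on the Ubuntu machine and only copy the resulting binaries to the board.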
I have some binaries built from C++ code that uses OpenMP under a Linux system. I want to run them in WSL (Windows Subsystem for Linux) on my Windows 10 machine. I think I should install some packages to support OpenMP. What do I need to install? Should I follow the Windows OpenMP installation routine and install the Intel C++ compiler, or the Ubuntu routine and install a recent GCC? Maybe both? (Then my system disk may not have enough space.)
In case my question above is ill-posed, let me state it in more detail. If I have a binary that uses OpenMP, is it necessary to install any package to use it? Can I just set the environment variable OMP_NUM_THREADS and run it directly? And if I have C++ code to compile into an OpenMP binary, what do I need to install under WSL?
I couldn't find an easy tutorial about OpenMP under WSL. Forgive my ignorance; could you give me a suggestion? Thank you.
I am trying to use a Kinect (the oldest one, the very first one released by Microsoft) and connect it to Matlab R2013b on a laptop running Windows 10.
I've already followed the suggested steps and installed the Matlab support package called "Kinect for Windows Runtime". This is supposed to install the drivers needed to run the Kinect on Windows as well.
However, when I use the command "imaqhwinfo", I get a warning saying that the Kinect is either not supported or not powered, and this output:
AdaptorDllVersion: '8.2 (R2013b)'
AdaptorName: 'kinect'
DeviceIDs: {1x0 cell}
DeviceInfo: [1x0 struct]
I'd really appreciate it if anyone could tell me what the error is, or share the configuration you used if you were able to get Matlab and the Kinect working together.
Thank you in advance
I'm trying to make a video editor for Android (I've never made an Android app before).
After searching for libraries to use, I came across FFmpeg, but I'm having trouble getting it to compile on Windows 7.
I'm currently using Eclipse and have the most recent Android SDK and NDK.
I've been trying to follow tutorials on the internet including roman10's (http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/) but they all seem specific to Linux.
Yesterday, I thought I'd give up and just dual-boot Ubuntu on my Windows laptop, but of course that was messing up too. I shrank my partition and booted Ubuntu off a USB drive, but the installer detected no OS on my laptop and didn't give me the option to install it alongside Windows 7.
So, can FFmpeg be compiled for Android on Windows? Or is there another library I could use to make a video editor that can?
Or should I just persist with dual-booting Ubuntu?
Thanks
Maybe you already know that on Windows you must use Cygwin. It is an open-source set of tools that provides functionality similar to a Linux distribution on Windows.
I work with the NDK on Linux. It avoids many of the problems and errors I ran into on Windows.
Here is a link on how to get started with Cygwin:
http://mindtherobot.com/blog/452/android-beginners-ndk-setup-step-by-step/
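For reference, roman10-style NDK builds of FFmpeg boil down to a configure invocation like the sketch below, run inside a Unix-like shell (Cygwin on Windows, or natively on Linux). The NDK location, platform level, and toolchain version here are hypothetical examples and will differ on your machine:

```shell
# Hypothetical paths -- adjust the NDK location, platform, and toolchain version.
NDK=$HOME/android-ndk-r9
SYSROOT=$NDK/platforms/android-9/arch-arm
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64

# Run inside the FFmpeg source tree: cross-compile for ARM Android.
./configure \
    --target-os=linux --arch=arm \
    --enable-cross-compile \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --sysroot=$SYSROOT \
    --disable-doc --disable-programs \
    --enable-shared --disable-static
make -j4
```

The shared libraries this produces can then be loaded from an Android app via the NDK; the build itself never runs on the device.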
I have recently purchased the ATtiny84 microcontroller, and I was wondering if I could upload code to it from my MacBook Pro running Snow Leopard. Specifically, could I compile C files and run FreeRTOS?
As H2CO3 suggests, you can use avr-gcc and avrdude, but I hope you're familiar with C, compiling, and setting up path variables in OS X.
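A minimal sketch of that avr-gcc/avrdude flow, assuming the AVR toolchain is already installed (e.g. via CrossPack or MacPorts); the programmer name `usbtiny` and the file name `blink.c` are just examples to be replaced with your own:

```shell
# Compile a C source for the ATtiny84 (-mmcu selects the target chip,
# -Os optimizes for size, which matters on an 8 KB part).
avr-gcc -mmcu=attiny84 -Os -o blink.elf blink.c

# Convert the ELF output to the Intel HEX format avrdude expects.
avr-objcopy -O ihex -R .eeprom blink.elf blink.hex

# Flash via an external programmer; -p t84 names the ATtiny84,
# -c usbtiny is one common programmer (use yours here).
avrdude -c usbtiny -p t84 -U flash:w:blink.hex
```

Note that the ATtiny84 has no USB or serial bootloader of its own, so some external ISP programmer is required between the Mac and the chip.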
If you're only getting started with embedded programming, I recommend having a look at the Arduino project; it's very easy to get started. Also, for your particular chip (ATtiny84), have a look at this guide from MIT:
I am trying to run some of the examples that use the Kinect and Processing together:
http://www.shiffman.net/p5/kinect/
I have installed everything described in the directions. However, when I run the Processing code, it says "Kinect not found". I have done nothing except:
1. install processing
2. install relevant libraries
3. connect kinect to computer
I have a feeling I am missing something obvious here. I am just confused because this particular library only works on Macs, but the Kinect generally works on Windows. Any guidance towards making my Mac recognize there is a Kinect connected to it would be great. Thanks!
You can use a Kinect on OS X, Windows, or Linux, depending on the drivers/SDK (e.g. Kinect for Windows will only run on Windows 7 and up, but OpenKinect/libfreenect or OpenNI will run on all of the above).
I'm not sure whether you need to install the libusb-devel driver or not, but you should give that a go, using MacPorts for example:
sudo port install git-core
sudo port install libtool
sudo port install libusb-devel
Also, the Processing OpenKinect wrapper is a bit limited; you can't get accelerometer data, for example. Other than the depth, RGB, and IR streams, which OpenNI also supplies, motor control is its only extra feature. OpenNI, on the other hand, also comes with hand tracking, gestures, skeleton tracking, scene detection, etc. I recommend giving SimpleOpenNI a go.