I want to know the internal clock reference pin number to add to the arachne-pnr PCF file. I am synthesizing for the Alchitry Cu board, which has the iCE40 HX8K chip (supported by Yosys and arachne-pnr). I couldn't work out the correlation between a pin number in the PCF file and a pin of the chip...
If someone could give me an example PCF file for, say, a DFF (something using a clock), I will figure out the rest. I searched Google for this but was unsuccessful.
I also saw some reference to a ".gate" command but didn't understand whether it should be added to the PCF file or some other file and compiled for the clock output. If so, please give me an example of that command.
Sorry for the long question. Any help will be much appreciated.
Thanks,
Bharat
Pin numbers in the PCF are the same as the package pin numbers. ".gate" appears in the BLIF file generated by the synthesis tool and is not something you put in the PCF.
Also note that arachne-pnr is now deprecated and largely unsupported; you should be using its successor, nextpnr, for open-source place and route. Its handling of PCF issues should be better too.
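For reference, here is a minimal sketch of a PCF for a DFF-style design whose top-level ports are clk, d and q. The port names and the data pins are made up for illustration, and the clock pin shown (P7, often cited for the Alchitry Cu's on-board 100 MHz oscillator on the CB132 package) is an assumption; verify every pin against the Alchitry Cu schematic or the board's reference .pcf before using it.

    # Example PCF: map top-level ports of a hypothetical DFF to package pins.
    # All pin assignments are placeholders; check the board schematic.
    # clk: assumed on-board oscillator pin on the CB132 package
    set_io clk P7
    # d, q: placeholder data pins
    set_io d B1
    set_io q C1

The left column is the port name in your Verilog top module; the right column is the physical package pin, which is why the numbers in the PCF match the package pinout rather than any internal numbering.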
While developing, it is not always easy to have an instrument connected via GPIB.
Python offers a way to simulate an instrument using PyVISA-sim.
Unless I am mistaken, the community is not very active, and I ran into several problems:
1- PyVISA-sim offers a wide range of "virtual" instruments, but only one of them actually communicates with my code
2- When I create my own .yaml file, Python fails to recognize it
I was wondering if there is an alternative way.
I will be even greedier: is there a GPIB simulator that can also simulate an instrument's response?
For example: while simulating a voltmeter, the virtual instrument would actually return some "random" readings.
The goal is to check whether my Byte-to-Single converter is working properly.
Thanks
The short answer is that it's very much a DIY job. There are hardware instrument simulators, but I'm not sure they will help you with your byte converter.
Instrument Simulator:
http://sine.ni.com/nips/cds/view/p/lang/en/nid/10763
Does the user manual not give you the format for the returned data?
I am using the Xilinx ISE 14.7 synthesizer. I am able to initialize my BRAM with a .coe file and access it. I can also update it with a new .mem file using the data2mem tool and update my bit file. Here I have configured it as a ROM.
My problem is that I don't know how to store the BRAM contents to a file. I am using a single-port block memory from the core generator, configured as a RAM. I want to write data to it and read it back later. I didn't find any relevant post about this; maybe it's only me who didn't find a way to save the contents to a file. For example, Altera's in-system memory content editor has an option to export the data to a file. Is there any such option in Xilinx, or some way to do it with the data2mem tool?
I could send the memory contents to the PC over a serial port, but that's not my concern right now. I am really looking for a way to store the contents to a file (probably a .mem file) and use it in MATLAB. Can anyone explain this to me or point me to a document or link? I have studied the relevant documentation but nothing clicked. Any suggestion would be highly appreciated. Also, tell me if I am thinking about this the wrong way.
Let me clarify: you want to initialize the BRAM with some data, do some processing on it, then auto-magically download it to the PC for further analysis? If so, you also need to decide whether you need a production solution or a debug solution. For production you need to design a data-dump module and connect it to a specific communication module, but for debug purposes you can do it over JTAG (be aware that dumping BRAM contents at runtime over JTAG will corrupt your data! make sure the circuit is stopped and nothing is updating the BRAM during the dump). If you have a Zynq device, you can try:
https://forums.xilinx.com/t5/7-Series-FPGAs/read-bram-from-jtag-or-uart-or-zynq-PS/td-p/738600
Otherwise, try the readback feature of the 7-series FPGAs:
https://www.xilinx.com/support/documentation/application_notes/xapp1230-configuration-readback-capture.pdf
I want to read the TSTR register (Thermal Sensor Thermometer Read Register) of my Intel chipset.
I have found that the __readmsr function is what I need.
I have also setup a kernel driver since the function is only available in kernel mode.
But I have no clue how to access the register...
The datasheet for the chipset states on page 857 that the offset address of the register is TBARB+03h.
How can I use this address? Are there any tutorials out there that could help me?
Thanks!
As far as I have figured out (I am trying to do the exact same thing), __readmsr is indeed the right intrinsic for accessing these registers:
http://msdn.microsoft.com/en-us/library/y55zyfdx%28v=VS.100%29.aspx
However, I am working on an i5, and Intel's documentation
http://www.intel.com/content/www/us/en/intelligent-systems/piketon/core-i7-800-i5-700-desktop-datasheet-vol-2.html
suggests that things like the MC_RANK_VIRTUAL_TEMP entries are registers, so it ought to work; you are probably on the right track. The particular register is on page 272, so technically this is indeed the answer: __readmsr(1568) in my case.
However, I am struggling to convince Visual Studio 2010 to build this in kernel mode, which it seems reluctant to do; I keep getting a "privileged instruction" error. When I get this out of the way and the whole program working, I'll write a tutorial on the general process, but until then I only dare give a theoretical answer. If your compiler tends to listen to what you say, just add the /kernel compiler option; since you are only reading and not writing the registers, it should be safe.
Edit:
There is also this article; the answer more or less suggests what I'm trying to do, though not much more than I have so far, but take a look anyway:
How to access CPU's heat sensors?
"This instruction must be executed at privilege level 0 or in real-address mode; otherwise, a general protection exception #GP(0) will be generated."
http://faydoc.tripod.com/cpu/rdmsr.htm
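Once the driver skeleton builds, the read itself is only a couple of lines. Here is a minimal sketch, assuming a WDK driver project; the index 1568 is the value from my datasheet and is specific to my chipset, so substitute whatever index your own datasheet gives for the register you are after.

    // Minimal sketch of reading an MSR from a kernel-mode driver (WDK).
    // The MSR index 1568 is chipset/CPU specific; use the one from your datasheet.
    #include <ntddk.h>

    NTSTATUS DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath)
    {
        unsigned __int64 value;

        UNREFERENCED_PARAMETER(DriverObject);
        UNREFERENCED_PARAMETER(RegistryPath);

        value = __readmsr(1568);                  // privileged: kernel mode only
        DbgPrint("MSR 1568 = 0x%I64x\n", value);

        return STATUS_SUCCESS;
    }

In user mode the same call raises exactly the privileged-instruction exception quoted above, which is why the driver is needed.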
Can anyone offer any advice on options for getting real-world events (i.e. sound, visual, motion) to trigger events on the Mac?
The simplest event, I think, might be sound.
I simply need sound of a certain volume to start an application on the desktop.
This application would be web based (i.e. JavaScript) or possibly standalone.
Most likely the former.
The first thing that comes to mind is an Arduino, but since I'm a total novice at Arduino coding, I was wondering if there are other third-party apps that might make this possible.
There are a number of pre-built applications available from the Arduino site here:
http://arduino.cc/en/Tutorial/HomePage
They will all invariably need some customization, but hey, that's half the fun, right? Here's an example that seems to do something similar to what you're describing. You would just need to write an app on the computer to listen for the right serial output (a rough sketch of such a listener is at the end of this answer).
http://arduino.cc/en/Tutorial/DigitalReadSerial
OR
http://arduino.cc/en/Tutorial/AnalogReadSerial
And remember, you can always ask more questions here on SO if you run into a problem.
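If you would rather roll the computer side yourself, here is a rough POSIX sketch of such a listener for the Mac. It assumes the Arduino shows up as /dev/tty.usbmodem14101 (yours will have a different name), talks at 9600 baud, and sends the character '1' when triggered; the application launched via the open command is also just a placeholder.

    // Rough macOS serial listener (POSIX termios). Device path, baud rate and
    // the launched application are all placeholders for illustration.
    #include <cstdlib>
    #include <iostream>
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    int main() {
        const char* port = "/dev/tty.usbmodem14101";   // placeholder device name
        int fd = open(port, O_RDONLY | O_NOCTTY);
        if (fd < 0) {
            std::cerr << "could not open " << port << "\n";
            return 1;
        }

        termios tty{};                  // configure 9600 baud, raw input
        tcgetattr(fd, &tty);
        cfmakeraw(&tty);
        cfsetispeed(&tty, B9600);
        cfsetospeed(&tty, B9600);
        tcsetattr(fd, TCSANOW, &tty);

        char c;
        while (read(fd, &c, 1) == 1) {
            if (c == '1') {
                std::system("open -a Safari");   // placeholder application
            }
        }
        close(fd);
        return 0;
    }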
Yes, you can :)
1) First of all, you'll need the Arduino to "listen" to a sensor. This can be achieved with the digitalRead or analogRead functions, depending on which kind of sensor you're going to use.
2) You'll need to send something to your Mac when the condition you want to check happens. This involves using the serial port to send some kind of message from the Arduino to your Mac.
3) On your Mac, you'll need to check for messages on the serial port. I often use Processing to listen to the serial port. You're lucky: you're on a Mac, so you have AppleScript :) This means Processing just has to launch a simple AppleScript that tells your chosen application to open.
That's it; a minimal Arduino-side sketch is included below. For further details, check Google to see how to send messages from the Arduino to Processing, and how to trigger AppleScript from Processing.
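Here is a minimal sketch for steps 1 and 2, assuming an analog sound sensor on pin A0 and a made-up threshold of 600; it simply prints '1' over serial whenever the level is crossed, which is what the listener on the Mac side would watch for.

    // Minimal Arduino-side sketch: read an analog sound sensor and notify the
    // computer over serial when the level crosses a threshold.
    // Pin A0 and the threshold of 600 are assumptions; tune them for your sensor.
    const int sensorPin = A0;
    const int threshold = 600;

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int level = analogRead(sensorPin);   // 0-1023 on a standard Arduino
      if (level > threshold) {
        Serial.println('1');               // the Mac-side listener watches for this
        delay(500);                        // crude debounce so we don't spam the port
      }
    }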
Occasionally we get this error when communicating between my motion-control software and a plasma cutting torch. The serial link is used for a one-time setup of cutting information before the cutting begins. I am using VB6 and MSComm for this.
I know the port itself has locked up, because after the error occurs other serial-comm software (diagnostics, etc.) can't access the port either. I would like to understand what MSComm is doing when it receives this error so I can find a better hardware solution.
Try using a different OCX.
www.comm32.com has a control that mimics MSComm but has many improvements.
Mscomm32.ocx is still fully supported by Microsoft. You could ask their support people to help. If you have an MSDN subscription, you may be entitled to free support incidents.
Apologies if you'd already thought of that, I hope someone else can give you a direct solution.
I've got the same problem, and that is why I came to this forum. After digging deeper into the problem of the communication getting lost, I found that the intermittent failures come from my USB-to-RS232 converter module. Because it is controlling external hardware, it is exposed to electrical noise that disrupts the USB-to-RS232 module. It gets resolved by unplugging it and plugging it back in, or by turning off the whole system. Make sure your software issue is not actually a hardware problem.