I have been developing an IP core, and it was working just fine until I used it in a project.
After that, two of the FPGA's output pins didn't work at all. After some investigation I realized that, in the design summary, these two pins were reported under Reg(s) with the value "OFF". I think these two pins were somehow switched off.
Can anyone tell me how I can switch a pin off and on, and what makes it behave that way?
These pins do a simple job, just turning an LED on and off. They work fine when I test the IP on its own; only in the bigger project do they show up as switched off.
The "OFF" in Reg (s) column doesn't mean that pin is turned off.
It means that the IOB is configured to use "Output Flip Flop" :)
Other modes might include IFF, ODDR, ENFF, etc.
Is it possible to use the digitalRead() function in the Arduino setup() to check whether a circuit is open or closed and act as a physical DEBUG flag?
Yes, you can. Do a pinMode() first; you may need to enable the pull-up. Then wait a tiny bit for the levels to settle, since you have probably just powered the board on. After that you can read the pin with digitalRead().
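As a rough sketch (the pin number, the INPUT_PULLUP wiring, and the debugMode name are illustrative assumptions, not from the question), it could look something like this:

    // Assumes a jumper/switch that connects pin 7 to GND when closed.
    const int DEBUG_PIN = 7;          // illustrative choice of pin
    bool debugMode = false;

    void setup() {
      pinMode(DEBUG_PIN, INPUT_PULLUP);             // internal pull-up: reads HIGH when open
      delay(10);                                    // let the level settle after power-up
      debugMode = (digitalRead(DEBUG_PIN) == LOW);  // closed circuit pulls the pin LOW

      Serial.begin(9600);
      if (debugMode) {
        Serial.println("DEBUG mode enabled");
      }
    }

    void loop() {
      // normal sketch; use debugMode to guard extra diagnostics
    }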
I am using a Cyclone V on a SoCKit board (link here) provided by Terasic, with an HSMC-NET daughter card (link here) connected to it, in order to build a system that can communicate over Ethernet, with both transmitted and received traffic passing through the FPGA. The problem is that I am having a really, really hard time getting this system to work using Altera's Triple Speed Ethernet core.
I am using Qsys to construct the system containing the Triple Speed Ethernet core, instantiating it inside a VHDL wrapper that also contains a packet generator module. The packet generator is connected directly to the transmit Avalon-ST sink port of the TSE core and is controlled through an Avalon-MM slave interface connected to a JTAG-to-Avalon-Master bridge core, whose master port is exported to the VHDL wrapper as well.
Then, using System Console, I configure the Triple Speed Ethernet core as described in the core's user guide (link here) in section 5-26 (Register Initialization), and instruct the packet generator module (also through System Console) to start generating Ethernet packets into the TSE core's transmit Avalon-ST sink interface ports.
Although everything is configured exactly as described in the core's user guide (linked above), I cannot get it to output anything on the MII/GMII output interfaces, nor get any of the statistics counters to increase or even change. Clearly I am doing something wrong, or missing something, but I just can't figure out what it is.
Can anyone please help me with this?
Thanks in advance,
Itamar
Starting with the basic checks:
Have you simulated it? It's not clear to me whether you are just simulating or actually synthesizing.
If you haven't simulated, you really should. If it's not working in simulation, why would it ever work in real life?
Make sure you are using the QIP file to synthesize the design. It will automatically include your auto-generated SDC constraints. You will still need to add your own pin constraints; more on that later.
The TSE is fairly old and reliable, so the obvious first things to check are Clock, Reset, Power and Pins.
a.) Power is usually less of a problem on devkits if you have already run the demo that came with the kit.
b.) Pins can cause a whole slew of issues if they are not mapped correctly for this core. I'll assume you are leveraging something from Terasic; it should define a pin for reset, the input clock, and the signal standards. A lot of the time this goes in the .qsf file, and you also reference the QIP file (mentioned above) there too.
c.) Clock and reset are the more likely culprits in my mind. No activity on the interface is a clue. One way to check is to route your clocks to spare pins and scope them to make sure they are what you think they are. Similarly, you may want to bring your reset out to a pin and check it. MAKE SURE YOU KNOW THE POLARITY and that you haven't been using ~reset in some places and non-inverted reset in others.
Reconfig block. Some Altera chips and certain versions of Quartus require you to use a reconfig block to configure the transceiver (XCVR). This doesn't seem like your issue to me, because you say the GMII is flatlined.
The context for this question is primarily Windows 7, though I've tried it on 10 as well.
I've built a 4-player composite joystick using the Arduino Mega 2560. It is a USB device composed of 4 HID joystick interfaces, each with its own endpoint. Each joystick, along with its buttons, shows up correctly in Device Manager as a separate HID interface. They are correctly identified by a VID/PID/MI_# triplet, with MI_# being the interface index (MI_0, MI_1, etc.). Calibration also sees each interface as separate, with inputs correctly corresponding to each controller in their enumerated order (i.e., the first one receives inputs only from the joystick at index 0). When I dump the descriptors, they also look correct.
There are two issues:
1) Naming
Windows only reads the interface string from the first interface. According to the descriptor dump, each interface should have its own string, going from "Player 1" to "Player 4". Windows 7 sees them all as "Player 1". Inspecting regedit, this may be because Windows 7 keeps only one OEM Name per joystick device, and so only picks up the one for the first interface. Am I stuck with this behaviour, unless I somehow get a resolution from Microsoft?
For some reason, Windows 10 calls them all "Arduino Joystick". I'm not sure whether it's because I'm using the same test VID/PID combo I got from an Arduino joystick tutorial, and Windows is just picking up the name someone else has used for their device, or whether it is concatenating my manufacturer string with the interface type "Joystick". I'm not sure why it would do the latter, but if it's the former I'd prefer to block that look-up somehow.
I'd like to resolve both, but practically speaking I'm using Windows 7 mostly.
2) Mixed Inputs
I've seen this behaviour only with some applications, but unfortunately one of them is Steam, and the others may be due to Unity. It manifests differently in each, so I'm led to believe there is no standard way of dealing with composite joysticks.
On Steam, in Big Picture mode, when I try to add/test a controller, it detects all 4 controllers (all as Player 4, I might add) but only accepts inputs from Joy4, no matter which controller I choose. After saving the config, however, all the joysticks have the same mappings applied. This is actually convenient, as I can use any controller to navigate Big Picture mode, but I'm concerned it's symptomatic of other problems I might be seeing in other applications.
In "Race the Sun", when manually configuring joystick controls (it says Player 4 is detected), it interprets inputs from a single joystick as coming from multiple joysticks. Usually, two of the four directional inputs come from Joy1, while the other two come from a joystick other than the one being used. E.g., if I'm configuring Joy2, it registers inputs from Joy1 and, say, Joy3.
In "Overcooked", a single joystick can register as 4 different players. Normally you'd hit a particular button on the controller you want to use to register as a player, but in my case, if you hit that button on Joy1 four times, then 4 players are registered. If you start the game like this, you end up controlling all 4 characters simultaneously with one joystick. Interesting, but not the intended usage, I'm sure.
Both "Race the Sun" and "Overcooked" are developed using Unity, and I understand that Unity's joystick management is rather lacking. Overcooked at least is designed to handle multiple players though (it's a couch co-op game), so this probably has more to do with the composite nature of my controllers.
I should note that other applications have no problems differentiating between the joysticks. Even xbox360ce sees them as separate, and the emulation works on several Steam games, single and multiplayer. Overcooked is still getting the joysticks crossed even though I'm using xbox360ce with it.
The question I'm bringing to Stack Overflow is: what can I do to improve how applications handle my joysticks? Right now I'm using the generic Windows game-controller driver. Theoretically this should be enough, but issue #1 shows that composite joysticks may not be an expected use case. Would driver development even have any hope of resolving the issues with the applications mentioned above, given that I don't see how the device would differ significantly in its identification? I'm hoping someone more experienced with coding for USB devices can offer some insight.
For what it's worth, my Arduino sketch and firmware can be found here.
I have found a solution for "Race the Sun" and "Overcooked".
"Race the Sun" may have expected my joystick to provide axis ranges from 0 to a certain maximum (eg: 32767). I had the firmware set up going from -255 to +255. While other applications handle this fine, "Race the Sun" may expect the more common 0-X behaviour (this is the range that a Logitech joystick of mine provides). After changing this, I was able to configure and play it correctly. I've also updated my GitHub project; the link is in the original question.
The problem with "Overcooked" was actually caused by a badly configured or corrupted xbox360ce installation. Somewhere in tinkering with the emulator I must have screwed something up, as I messed up games that were previously working. I solved it by wiping all its files, including the content in ProgramData/X360CE, and re-downloading everything and redoing the controllers. Now all my games seem to be behaving correctly.
This still leaves the problem with Steam: for some reason Steam doesn't remember my joystick configuration from one reboot to the next. For the time being I've decided to put up with the default joystick behaviour, but I'd like to sort this one out eventually, too.
I have an HID device that is somewhat unfortunately designed (the Griffin Powermate), in that as you turn it, the input value for the "Rotation Axis" HID element doesn't change unless the speed of rotation changes dramatically or the direction changes. It sends many HID reports (the angular resolution appears to be about 4 degrees, in that I get ~90 reports per revolution; not great, but whatever), but they all report the same value (generally -1 or 1 for CCW and CW respectively; if you turn much faster, it will report -2 and 2, and so on). As a result of this unfortunate behaviour, I'm finding this thing largely useless.
It occurred to me that I might be able to write a background userspace app that seizes the physical device and presents another, virtual device with some minor additions, so as to cause an input value change for every report (like a wrap-around accumulator, which the HID spec supports; God only knows why Griffin didn't do this themselves).
But I'm not seeing how one would go about creating the kernel side object for the virtual device from userspace, and I'm starting to think it might not be possible. I saw this question, and its indications are not good, but it's low on details.
Alternatively, if there's a way for me to spoof reports on the existing device, I suppose that would do as well, since I could set the value back to zero immediately after it reports -1 or 1.
Any ideas?
First of all, you can simulate input events via Quartz Event Services, but this might not suffice for your purposes, as it is mainly designed for simulating keyboard and mouse events.
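For reference, a minimal sketch of what that looks like on macOS (linking against the ApplicationServices framework; key code 0 is 'a' on an ANSI layout). Since it only covers keyboard/mouse-style events, it probably won't help with a rotation axis:

    #include <ApplicationServices/ApplicationServices.h>

    int main() {
      // Synthesize a key-down/key-up pair and post it to the HID event tap.
      CGEventRef keyDown = CGEventCreateKeyboardEvent(NULL, (CGKeyCode)0, true);
      CGEventRef keyUp   = CGEventCreateKeyboardEvent(NULL, (CGKeyCode)0, false);
      CGEventPost(kCGHIDEventTap, keyDown);
      CGEventPost(kCGHIDEventTap, keyUp);
      CFRelease(keyDown);
      CFRelease(keyUp);
      return 0;
    }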
Second, the HID driver family of the IOKit framework contains a user client on the (global) IOHIDResource service, called IOHIDResourceDeviceUserClient. It appears that this can spawn IOHIDUserDevice instances on command from user space. In particular, the userspace IOKitLib contains an IOHIDUserDeviceCreate function which appears to be intended for exactly this. The HID family source code even comes with a little demo of it that creates a virtual keyboard of sorts.
Unfortunately, although I can get the demo to build, it fails on the IOHIDUserDeviceCreate call. (I can see in IORegistryExplorer that the IOHIDResourceDeviceUserClient instance is never created.) I've not investigated this further due to lack of time, but it seems worth pursuing if you need its functionality.
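A rough sketch of the shape of that approach is below. IOHIDUserDevice.h is not shipped in the public SDK, so the prototypes are declared by hand from the IOHIDFamily open-source headers; treat the signatures, the "ReportDescriptor" property key, and the placeholder descriptor as assumptions on my part rather than a known-good recipe:

    // Prototypes copied by hand; IOHIDUserDevice.h isn't in the public SDK.
    #include <IOKit/IOKitLib.h>
    #include <CoreFoundation/CoreFoundation.h>

    typedef struct __IOHIDUserDevice *IOHIDUserDeviceRef;
    extern "C" IOHIDUserDeviceRef IOHIDUserDeviceCreate(CFAllocatorRef allocator,
                                                        CFDictionaryRef properties);
    extern "C" IOReturn IOHIDUserDeviceHandleReport(IOHIDUserDeviceRef device,
                                                    uint8_t *report, CFIndex length);

    int main() {
      // Stand-in vendor-defined descriptor; a real virtual Powermate would
      // instead describe a dial with a wrap-around accumulator.
      static const uint8_t desc[] = {
        0x06, 0x00, 0xFF,  // USAGE_PAGE (Vendor Defined 0xFF00)
        0x09, 0x01,        // USAGE (1)
        0xA1, 0x01,        // COLLECTION (Application)
        0x15, 0x00,        //   LOGICAL_MINIMUM (0)
        0x26, 0xFF, 0x00,  //   LOGICAL_MAXIMUM (255)
        0x75, 0x08,        //   REPORT_SIZE (8)
        0x95, 0x01,        //   REPORT_COUNT (1)
        0x09, 0x01,        //   USAGE (1)
        0x81, 0x02,        //   INPUT (Data,Var,Abs)
        0xC0               // END_COLLECTION
      };

      CFDataRef descData = CFDataCreate(kCFAllocatorDefault, desc, sizeof(desc));
      CFStringRef key = CFSTR("ReportDescriptor");   // kIOHIDReportDescriptorKey
      CFDictionaryRef props = CFDictionaryCreate(kCFAllocatorDefault,
                                                 (const void **)&key,
                                                 (const void **)&descData, 1,
                                                 &kCFTypeDictionaryKeyCallBacks,
                                                 &kCFTypeDictionaryValueCallBacks);

      // This is the call that fails for me (no IOHIDResourceDeviceUserClient
      // instance ever shows up in IORegistryExplorer).
      IOHIDUserDeviceRef dev = IOHIDUserDeviceCreate(kCFAllocatorDefault, props);
      if (dev) {
        uint8_t report[1] = { 0 };   // would be forwarded from the real device
        IOHIDUserDeviceHandleReport(dev, report, sizeof(report));
      }

      CFRelease(props);
      CFRelease(descData);
      return dev ? 0 : 1;
    }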
I am new to VHDL and FPGAs. I have written a sample piece of code that XORs a and b and stores the result in c. The code uses a VHDL behavioural architecture. I am using Quartus 11.1 + SP2-2.11.
I assigned the pins, say, a to SW0, b to SW1, and c to LEDG0. Everything compiles and there are no errors. I go to Tools -> Programmer. My FPGA is in RUN mode. The mode in the Programmer is JTAG, and hence the hardware setup is USB-Blaster [PORT 0]. When I load the .sof file and click "Start", the progress says "failed". I do not know why.
I have tried searching everywhere, but all the tutorials and links give the same explanation. I guess hardly anyone has encountered this problem. I want to know whether I am missing something; I want to get my fundamentals right!
Are you by any chance using Linux? If you are, make sure you've done this: http://www.alterawiki.com/wiki/Quartus_for_Linux#Setup_JTAG
There can be multiple reasons why loading a .sof to the FPGA fails. I figured out the following for my device. If you are a beginner, please check the same:
1) Make sure you have the data sheet of your device with you. I followed a tutorial and entered the device number they mentioned, not the one I actually had.
2) Check the pin assignments. This is the most important step. I found the pins used for the various switches and LEDs in a consolidated document online.
3) If it still does not work, it is best to contact experts.
Is the FPGA an Altera DE2? If yes, you can try this file, which works with the DE2 board, so that you can tell whether it is your .sof file that needs to be changed. If the USB-Blaster appears in the Quartus Programmer, then your driver is most likely installed correctly, and you should verify whether it is your .sof file that needs to change or something else.