Can the NFC hardware in the Lumia 920 emulate a 125 kHz proximity access card?
It looks like the NFC hardware implements a standard that is a superset of the one access cards use, but I don't know enough about these radio standards to tell whether a phone can act only as a receiver of such signals or also as a transmitter.
I would also appreciate a link to a good overview article that explains these standards in simpler terms than the official specifications.
Currently, card emulation is not supported by the Windows Phone proximity API.
Sources:
NFC, card emulation
NFC Developer Comparison
It is also impossible from the physics and standards perspective. NFC is based on ISO/IEC 14443 and uses a 13.56 MHz carrier in the HF band, while 125 kHz access cards operate in the LF band (roughly 125 kHz - 134 kHz) and couple to the reader by magnetic induction through a coil tuned to that frequency.
The NFC antenna and analog front end in the phone are tuned to 13.56 MHz, so they can neither generate nor receive a usable 125 kHz field.
Conclusion - it is physically impossible in this exact case. It would only work if the access system itself used 13.56 MHz proximity cards (that is possible).
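As a rough back-of-the-envelope check: the reader antenna is a coil resonated with a capacitor, and that resonance fixes the usable carrier. Moving a front end tuned to 13.56 MHz down to 125 kHz would require increasing the LC product by roughly four orders of magnitude, which fixed phone hardware cannot do:
$$f_0 = \frac{1}{2\pi\sqrt{LC}} \quad\Rightarrow\quad \frac{(LC)_{125\,\mathrm{kHz}}}{(LC)_{13.56\,\mathrm{MHz}}} = \left(\frac{13.56\ \mathrm{MHz}}{125\ \mathrm{kHz}}\right)^{2} \approx 1.2\times10^{4}$$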
Take a look here:
This white paper details the features in Windows Phone 8.1 that you can use to create a UICC-based NFC card-emulation app.
http://www.microsoft.com/en-us/download/details.aspx?id=43681
I'm pretty new to coding in VHDL and I just finished making a simple game using a fairly rough VGA driver that I made. The last thing I need to do is hook up a joystick to control the object in the game (this game is a mini project, so I have to present it, and using the onboard switches wouldn't cut it). The problem is that the joystick gives an analog input and I don't know how to include that in my VHDL program, or whether it's even possible. I'm using a DE10-Lite board. I'm sorry if my question is messy and I hope I made it clear. Thanks in advance.
The DE10-Lite is built around a MAX 10 FPGA, which has two on-chip ADCs, and the board has analog buffers that scale 5 V analog inputs down to the acceptable 2.5 V range.
You'll need to instantiate the "Modular ADC core" and a PLL to clock it.
Depending on your project's needs you can instantiate just the ADC control core (it has a simple streaming interface), or the "standard sequencer with Avalon-MM sample storage".
Check the board's manuals to find which pins are connected to the banks with ADC inputs.
There is also an example ADC project on the "CD-ROM" that you can download from the Terasic site.
Is there an option for time synchronization (e.g. IEEE AVB/TSN) of wireless nodes (IEEE 802.11 stations) within the INET framework?
And if so, how is the resource reservation realized at the medium-access level?
The point coordination function (PCF) is not implemented yet: https://github.com/inet-framework/inet/blob/master/src/inet/linklayer/ieee80211/mac/coordinationfunction/Pcf.h#L40
2020-04-21 UPDATE:
Avnu has a white paper on the concept: https://avnu.org/wireless-tsn-paper/
Avnu Alliance members... wrote the Avnu Alliance Wireless TSN – Definitions, Use Cases & Standards Roadmap white paper in an effort to generate awareness and start defining work required in Avnu to enable wireless TSN extensions in alignment with wired TSN systems and operation models.
2018-04-03 UPDATE:
IEEE Std. 802.1AS-2011 (gPTP) has a clause 12 titled "Media-dependent layer specification for IEEE 802.11 links." So, yes, it seems time synchronization is possible over WIFI, and is in fact defined in an IEEE standard.
2017-12-13 UPDATE:
It looks like the OpenAvnu project has been working on this idea. Check out this pull request, which seems to implement the precision time-stamping required for AVB on a WIFI connection.
OpenAvnu Pull Request #734: "Added wireless timestamper and port code"
This should probably be asked in multiple questions, with each question relating to the implementation of one of the core AVB/TSN protocols on a WIFI (802.11) network. Audio video bridging (AVB) and time sensitive networking (TSN) are not IEEE standards or protocols. What we call AVB or TSN (I'm just going to use AVB from now on) is a singular name for the use and implementation of multiple IEEE standards in order to achieve real-time media transfer.
These core protocols are:
IEEE Std. 802.1BA-2011: Profiles and configurations of IEEE standards which define what an AVB endpoint or bridge needs to do. This is the closest we get to one single standard for AVB.
IEEE Std. 1722(-2016): A Layer 2 audio video transport protocol (AVTP)
IEEE Std. 1722.1(-2013): Audio video discovery, enumeration, connection management and control (AVDECC) for 1722-based devices
IEEE Std. 802.1AS(-2011): Generalized precision time protocol (gPTP)
IEEE Std. 802.1Q(-2014): FQTSS and SRP
(note that according to the IEEE TSN webpage, currently published TSN-specific standards will be rolled into 802.1Q, so the list above should still be accurate)
Because stream reservation (SRP), timing (gPTP), and media transport (1722 or 1723) are independent, your question should probably be asking about them independently.
How can/should IEEE 802.1AS (gPTP) be implemented in a WIFI (802.11) network?
How can/should IEEE 802.1Q (SRP and FQTSS) be implemented in a WIFI network?
1. I have nowhere near the experience these standards developers have, and some of them have explored gPTP on WIFI extensively. The "how" of gPTP is well explained by Kevin Stanton of Intel here.
And for WIFI in particular, Norman Finn from Cisco had some notes on using gPTP with WIFI networks here.
I couldn't find anything that explicitly laid out how best to use/implement gPTP with WIFI. Ethernet is really where a lot of this is happening right now.
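Whatever the link type, the arithmetic 802.1AS performs at each node is small; the hard part over WIFI is obtaining accurate ingress/egress timestamps (clause 12 builds on the 802.11 Timing Measurement mechanism for that). Here is a minimal Python sketch of the peer-delay and offset math, with illustrative names, ignoring the neighbor rate ratio and correction fields:

```python
# Minimal sketch of the IEEE 802.1AS timing arithmetic, ignoring the
# neighbor rate ratio and correction-field accumulation for clarity.
# Timestamps are nanoseconds; names are illustrative, not from the standard.

def mean_link_delay(t1_req_tx, t2_req_rx, t3_resp_tx, t4_resp_rx):
    """Peer-delay measurement from a Pdelay_Req/Pdelay_Resp exchange."""
    return ((t4_resp_rx - t1_req_tx) - (t3_resp_tx - t2_req_rx)) / 2.0

def offset_from_master(sync_rx_time, precise_origin_ts, link_delay):
    """Local clock offset relative to the grandmaster, computed when a
    Sync/Follow_Up pair arrives."""
    return sync_rx_time - (precise_origin_ts + link_delay)

if __name__ == "__main__":
    d = mean_link_delay(t1_req_tx=1_000, t2_req_rx=1_530,
                        t3_resp_tx=1_830, t4_resp_rx=2_340)    # 520 ns
    off = offset_from_master(sync_rx_time=10_000_520,
                             precise_origin_ts=10_000_000,
                             link_delay=d)                     # 0 ns
    print(f"link delay = {d:.0f} ns, offset = {off:.0f} ns")
```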
2. Craig Gunther from Harman says:
Simply implement[ing] the SRP control protocol without performing the related reservation actions. ... 802.11ak along with 802.1Qbz may make this much simpler. .... 802.11ac and 802.11ad have created some interesting new technologies that may help with reservations...
Source: http://www.ieee802.org/1/files/public/docs2014/at-cgunther-srp-for-802-11-0114-v01.pdf
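For a sense of what a reservation actually commits: the talker's TSpec (MaxFrameSize, MaxIntervalFrames) plus the class measurement interval (125 µs for SR class A, 250 µs for class B) determine the reserved bandwidth. The sketch below is illustrative only; the 42-byte per-frame overhead is a wired-Ethernet assumption (preamble, header, VLAN tag, FCS, interframe gap), and on 802.11 the per-frame overhead is larger and far less deterministic, which is part of why reservations are hard there.

```python
# Illustrative SRP bandwidth math from a talker's TSpec.
# Assumes a wired-Ethernet per-frame overhead of 42 bytes; 802.11 differs.

CLASS_MEASUREMENT_INTERVAL = {"A": 125e-6, "B": 250e-6}  # seconds

def reserved_bandwidth_bps(max_frame_size, max_interval_frames,
                           sr_class="A", per_frame_overhead=42):
    frames_per_second = max_interval_frames / CLASS_MEASUREMENT_INTERVAL[sr_class]
    bits_per_frame = (max_frame_size + per_frame_overhead) * 8
    return frames_per_second * bits_per_frame

# Hypothetical talker: 224-byte frames, one frame per class A interval.
print(reserved_bandwidth_bps(224, 1) / 1e6, "Mbit/s")   # ~17.0 Mbit/s
```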
Personally, I feel like guaranteed latency and reliability are very hard to ask for with a network that has to do things like carrier-sense multiple access with collision avoidance (CSMA/CA), but that's just my inexperienced opinion. It certainly would be cool, but it seems very... challenging.
I would like to reproduce the experiment by Dr. Adrian Thompson, who used a genetic algorithm to evolve an FPGA configuration that can distinguish between two different sound signals extremely efficiently. For more information please visit this link:
http://archive.bcs.org/bulletin/jan98/leading.htm
After some research I found this FPGA board:
http://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=167&No=836&PartNo=1
Is this board capable of reproducing Dr. Adrian Thompson's experiment, or do I need a different one?
Thank you for your support.
In terms of programmable logic, the DE1-SoC is about 20x bigger and has about 70x as much embedded memory. Practically any modern FPGA is bigger than the Xilinx XC6216 cited in his papers, as was pointed out in the other instance of this question you asked.
That said, most modern FPGAs don't allow the same fine-grained configuration as older FPGAs - the internal routing and block structures are more complex, and FPGA vendors want to protect their products and compel you to use their CAD tools.
In short, yes, the DE1-SoC will be able to contain any design from 12+ years ago. As for replicating the specific functions, you should do some more research to determine whether the methods used are still feasible with modern chips and CAD tools.
Edit:
user1155120 elaborated on the features of the XC6216 (see link below) that were of value to Thompson.
Fast Configuration: A larger device will generally take longer to configure, as you have to send more configuration data. That said, I/O interfaces are faster than they were 15 years ago, so it depends on your definition of "fast".
Reconfiguration: Cyclone V chips (like the one in the DE1-SoC) do support partial reconfiguration, but the subscription version of the Quartus II software is required, in addition to a separate license to support PR. I don't believe it supports wildcard reconfiguration, though I could be mistaken.
Memory-Mapped Addressing: The DE1-SoC's internal data can be accessed through the USB Blaster interface. However, this requires using System Console on the host PC, so it's not direct access.
I'm interested in starting a hobbyist project where I do some image processing by interfacing HW and SW. I am quite a newbie at this. I know how to do some basic image processing in Matlab using the existing image processing commands.
I personally enjoy working with HW and wanted to use a combination of HW and SW to do this. I've read articles about people using FPGAs, or basic FPGAs with microcontrollers, to go about doing this.
Here is my question: can someone recommend languages I should consider for the interfacing on the PC side? I imagine the SW part would essentially be a GUI and a placeholder for all the processing that is done on the HW. Also, in terms of selecting the HW and realistically considering what I could do on it, could I get a few recommendations on that too?
Any recommendations will be appreciated!
EDIT: I read a few of the other posts saying requirements are directly related to knowing what kind of image processing one is doing. Initially, I want to do fingerprint recognition, so filtering, locating unique markers in the image, etc.
It all depends on what you are familiar with, how you plan on doing the interface between FPGA and PC, and generally the scale of what you want to do. Examples could be:

A fast system could for instance consist of a Xilinx SP605 board, using the PCI Express interface to quickly transfer image data between PC and FPGA. For this, you'd need to write a device driver (in C) and a user-space application (I've done this in C++/Qt).

A more realistic hobbyist system could be a Xilinx SP601 board, using Ethernet to transfer data - you'd then just have to write a simple protocol (possibly using raw sockets (no TCP/UDP) to make the FPGA-side Ethernet simpler), which can be done in basically any language offering network access (there's a Xilinx reference design for the SP605 demonstrating this). See the sketch after this list for the host-side idea.

The simplest and cheapest solution would be an FPGA board with a serial connection - you probably wouldn't be able to do any "serious" image processing with this, but it should be enough for very simple proof-of-concept stuff, although the smaller FPGA devices used on these boards typically do not have much on-board memory available.
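To make the Ethernet option concrete, the host side can be as simple as pushing image chunks in raw Ethernet frames with an experimental EtherType, so the FPGA only has to match a fixed header. A rough Python sketch of that host-side idea (interface name, MAC addresses and framing are placeholders made up for illustration; Linux raw sockets need root):

```python
import socket

IFACE     = "eth0"                                  # placeholder interface
FPGA_MAC  = bytes.fromhex("02aabbccddee")           # made-up locally administered MAC
HOST_MAC  = bytes.fromhex("02aabbccdd01")
ETHERTYPE = (0x88B5).to_bytes(2, "big")             # 0x88B5: IEEE "local experimental" EtherType

def send_image(data: bytes, chunk: int = 1024):
    """Split an image buffer into raw Ethernet frames and push them to the FPGA."""
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)  # Linux only, needs CAP_NET_RAW
    s.bind((IFACE, 0))
    for seq, off in enumerate(range(0, len(data), chunk)):
        payload = seq.to_bytes(4, "big") + data[off:off + chunk]  # trivial sequence header
        s.send(FPGA_MAC + HOST_MAC + ETHERTYPE + payload)
    s.close()

if __name__ == "__main__":
    with open("test_image.raw", "rb") as f:          # placeholder file name
        send_image(f.read())
```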
But again, it all depends on what you actually want to do.
I have raw captured data from a spectrometer that was listening on wifi (802.11b) channel 6 (two laptops in ad-hoc mode pinging each other).
I would like to decode this data in MATLAB.
I have it as a complex vector of 4.6 million complex samples, and its spectrum looks quite nice. I am looking for a document a bit less complicated than the IEEE 802.11 standard (which I have).
I can share the measurement data with other people.
There are now a few solutions around for decoding 802.11 using Software Defined Radio (SDR) techniques. As mentioned in a previous answer, there is software based on gnuradio - specifically gr-ieee802-11 and also 802.11n+. Plus, higher-end SDR boards like WARP use FPGA-based implementations of 802.11. There are also a number of implementations of 802.11 for Matlab available, e.g. 802.11a.
If your data is really raw then you basically have to build every piece of the signal processing chain in software, which is possible but not really straightforward. Have you checked the relevant wikipedia page? You might use gnuradio instead of starting from scratch.
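If the capture contains the 1 Mbit/s DSSS part of the frames (DBPSK spread with the 11-chip Barker code), the despreading step itself is short; the work is in the surrounding frequency correction, chip-timing recovery, descrambling and PLCP parsing. A rough Python/NumPy sketch of just the despread-and-slice step (the same idea translates directly to MATLAB); it assumes chip-aligned, frequency-corrected baseband at 11 Msps, which a real capture won't give you for free:

```python
import numpy as np

# Barker-11 spreading sequence used by 802.11b at the 1 Mbit/s DBPSK rate.
BARKER = np.array([+1, -1, +1, +1, -1, +1, +1, +1, -1, -1, -1], dtype=float)

def despread_dbpsk(iq, samples_per_chip=1):
    """Despread and differentially demodulate 1 Mbit/s 802.11b symbols.

    Assumes `iq` is complex baseband at 11 Msps * samples_per_chip,
    frequency-corrected and aligned to a symbol boundary. The returned
    bits still carry the 802.11 scrambler, so descrambling comes next.
    """
    chips = iq[::samples_per_chip]                             # one sample per chip
    n_sym = len(chips) // 11
    symbols = chips[:n_sym * 11].reshape(n_sym, 11) @ BARKER   # correlate with Barker
    diffs = symbols[1:] * np.conj(symbols[:-1])                # DBPSK: phase change carries the bit
    return (np.real(diffs) < 0).astype(np.uint8)               # pi shift -> 1, no shift -> 0

# Usage idea:
#   iq = np.fromfile("capture.bin", dtype=np.complex64)        # placeholder file/format
#   bits = despread_dbpsk(iq)
```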
I have used the IEEE 802.11 standard to encode and decode data in MATLAB.
Encoding data is an easy task.
Decoding is a bit more sophisticated.
I agree with Stan, it is going to be tough doing everything yourself. You may get some ideas from the projects on CGRAN, like:
https://www.cgran.org/wiki/WifiLocalization