Is there a cryptographic mechanism by which it is possible to sign a document with a date, such that it is not possible to forge that same signature at a later date? Maybe some sort of server that publishes daily cryptographic keys (but how can you trust them? ;-).
For the inevitable prodding pragmatists, I'm not trying to accomplish some task. I'm just curious what the solution space is like.
This is called Timestamping (TSP protocol, RFC 3161). Different digital signature standards (PDF and XML signatures, CAdES, PAdES, XAdES) include support for advanced timestamping based on TSP.
MS Authenticode also includes timestamping, but uses a different (incompatible and less secure) mechanism for it.
TSP alone (without signature protocols) is not used a lot, but in conjunction with signature standards it becomes very handy.
GuardTime has an interesting service where the timestamp does not depend on a trusted third party ("signed time") but can be independently verified by any interested party. It works somewhat similarly to Bitcoin, being based on hash trees.
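To make the hash-tree idea concrete, here is a minimal sketch in Python. It is only an illustration of the principle, not GuardTime's or RFC 3161's actual formats: the function names, the `(sibling, side)` path encoding, and the published root are all assumptions I've made up. The client hashes its document (the digest, not the document, is what a TSP client sends to a timestamping authority), the digest is aggregated into a hash tree, and later anyone can recompute the published root from the leaf plus an audit path.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Digest of the document; this is what would act as the leaf in the tree
# (and what an RFC 3161 client would send to the timestamping authority).
doc_digest = sha256(b"contents of the document to be timestamped")

def verify_audit_path(leaf: bytes, path: list[tuple[bytes, str]], expected_root: bytes) -> bool:
    """Recompute the root of a hash tree from a leaf and its audit path.

    `path` is a list of (sibling_hash, side) pairs, where side is "L" or "R"
    depending on which side the sibling sits on. If the recomputed root equals
    the value that was published (e.g. in a newspaper or other widely
    witnessed medium), the leaf provably existed when that root was created.
    """
    node = leaf
    for sibling, side in path:
        if side == "L":
            node = sha256(sibling + node)
        else:
            node = sha256(node + sibling)
    return node == expected_root
```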
I am working with Informal Systems on a Tendermint light client to be used in the IBC protocol in parachains. We need to use the Polkadot host function for ed25519 signature verification for performance benefits, but we need Substrate to migrate to ed25519-consensus instead of ed25519-dalek, due to some inconsistencies in the signature validation criteria of ed25519-dalek's implementation of the ed25519 scheme. The issues are explained better in this article: https://hdevalence.ca/blog/2020-10-04-its-25519am
More details are specified in the README here: https://github.com/penumbra-zone/ed25519-consensus
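For context, this is what a basic Ed25519 sign/verify round trip looks like in Python with the pyca/cryptography package. It is only a generic illustration of the operation being discussed; the dalek-vs-consensus issue is about which edge-case signatures a verifier accepts (and whether single and batch verification agree), which a happy-path example like this does not exercise.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"block header bytes"
signature = private_key.sign(message)

try:
    # verify() raises InvalidSignature instead of returning a bool.
    public_key.verify(signature, message)
    print("signature accepted")
except InvalidSignature:
    print("signature rejected")
```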
I see a lot of videos showing withdrawals from ATMs with cloned EMV cards, for example with service code 201, so
how can this be possible when a card uses DDA (dynamic keys)? Another question: is the card cloned only for the magstripe, because it is unencrypted, or can the EMV chip be cloned as well?
This is a question that does not really belong on SO, and due to the nature of what you are asking about, you may not receive a detailed answer. I will still try.
First of all, ATMs are online-only devices that do not need any form of Offline Data Authentication, so DDA normally has little to do with them (there are exceptions to this rule, as usual).
Nevertheless, there are still dynamic keys that are meant to provide security. A standard symmetric-key algorithm is used to generate an online cryptogram, which is validated by the issuer. The symmetric keys are individual to each specific card and are not easily extracted (of course, nothing can ever be treated as 100% secure, but extracting the keys from a single card would require a complex hardware attack).
I assume your question about service code 2xx, 5xx or 6xx mostly revolves around magstripe data with no chip data available. In some situations (e.g. when the card is mute), a fallback to a magstripe transaction may happen. Normally, unattended devices should have this option blocked and decline such attempts, but I would not bet that there are no such devices around the world. You also need to consider that there are still devices that are not EMV-capable.
When it comes to magstripe data, it can easily be modified (for instance, changing the service code), although such a modification should be detectable. The same goes for using EMV track-equivalent data on the magstripe. In both cases, the issuer is capable of detecting modification of the data, or its use on a different interface, through the CVC/CVV, which is encoded on the track and exists to cryptographically protect the integrity of the track data. However, this requires a proper implementation on the issuer side to detect and decline attempts where the cryptographic data from the CVV or the cryptogram are incorrect.
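As a rough illustration of why such modifications are detectable, here is a toy sketch in Python. This is not the actual CVV/CVC algorithm (which is a DES-based scheme run inside issuer HSMs); the key, field layout and truncation are all made up for illustration. The point is only that the issuer holds a card-unique secret and checks a short keyed value computed over the track fields, so changing the PAN, expiry or service code without that secret invalidates the check value.

```python
import hmac, hashlib

def check_value(issuer_key: bytes, pan: str, expiry: str, service_code: str) -> str:
    """Toy stand-in for a CVV/CVC: a truncated keyed MAC over the track fields."""
    msg = (pan + expiry + service_code).encode()
    return hmac.new(issuer_key, msg, hashlib.sha256).hexdigest()[:3]

issuer_key = b"card-unique secret held only by the issuer"
stored = check_value(issuer_key, "4111111111111111", "2612", "201")

# An attacker rewrites the service code on the magstripe to 101,
# but cannot recompute the check value without the issuer key:
tampered = check_value(issuer_key, "4111111111111111", "2612", "101")
print(stored == tampered)  # False -> issuer detects the modification and declines
```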
I have a very large block of code (it takes a few seconds to encrypt).
I use KeyA to encrypt it.
Later in the process, I receive a key (not necessarily KeyA)...
but I don't need to open the block yet;
what I really need is to validate that this is really the key that will open the block correctly.
I assumed I could keep a known block and encrypt it,
and to validate the key, decrypt only that known block, but it feels like this weakens the power of the cryptography (brute force becomes easier, and one can learn a few things about the key's properties).
Does my assumption really weaken the cipher? Why or why not?
Is there a different way to ensure a key matches without opening the whole block?
I am assuming you are using Symmetric-Key Cryptography (the kind where the key used to decrypt the file is the same as the one used to encrypt it).
If the cipher is vulnerable to a Known-Plaintext Attack, then the known block of plaintext may reveal information about the key. The stream cipher used for ZIP files suffered from this problem. Because ZIPs are compressed, it was difficult to guess enough plain-text, but the checksum used to verify passwords (among other factors) helped provide sufficient plain-text for a practical attack.
In principle you could publicize the hash of KeyA (assuming that the hash algorithm is strong enough that it cannot be reversed, and that the hash algorithm isn't also used internally by the cipher). This would allow you to quickly reject invalid keys without changing the way the message is encrypted.
Taking this idea further, you could use a Message authentication code such as HMAC. A message authentication code will validate that the message (in this case your very large block of code, or perhaps just its file path) has not been tampered with, as well as validating that the key is correct.
If you are concerned that this will make brute force easier or expose properties of the key, you could split the key into two parts. The first part of the key could be purely for validation, and the second part purely for decryption. e.g. MyKey = AuthenticationPart,DecryptionPart
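Here is a minimal sketch of that idea in Python. The derivation labels, the check-value construction, and the use of HMAC-based subkey derivation (rather than literally concatenating two independent parts) are my own choices for illustration, not a standard: derive separate verification and decryption subkeys from the candidate key, store a check value under the verification subkey at encryption time, and compare it later before attempting any decryption.

```python
import hmac, hashlib

def subkey(master_key: bytes, label: bytes) -> bytes:
    # Derive independent subkeys so the check value reveals nothing
    # about the key actually used for decryption.
    return hmac.new(master_key, label, hashlib.sha256).digest()

def make_check_value(master_key: bytes) -> bytes:
    return hmac.new(subkey(master_key, b"verify"), b"key-check", hashlib.sha256).digest()

key_a = b"the secret KeyA that encrypted the block"

# Stored alongside the ciphertext at encryption time:
stored_check = make_check_value(key_a)

def key_matches(candidate: bytes) -> bool:
    # Constant-time comparison, done before any decryption is attempted.
    return hmac.compare_digest(make_check_value(candidate), stored_check)

print(key_matches(key_a))         # True
print(key_matches(b"wrong key"))  # False
# Actual decryption would use subkey(key_a, b"decrypt"), never key_a directly.
```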
(Disclaimer: This is based on my very incomplete understanding of crypto. You might get better responses from the experts on security.stackexchange.com and/or crypto.stackexchange.com)
I have been starting to see uggc/uggcf (rot-13 encoded http/https) links show up in our system.
Are these worth supporting, is there actually a demand for it? The IETF document (link) has not been touched since 2001 and I cannot find much information on them at all.
Is there an area of the world where this is more common? I've only noticed them since we went world-wide.
The document describes it as a method to 'secure' the url as well as the data. What is the value of rot-13 encoding the data if it can be reversed without a key? HTTPS handles all of this, except for the domain itself.
I know this is an old question, but it is an interesting topic, so I'll try to answer it:
The "Encrypted Hypertext Transfer Protocol -- UGGC/1.0" specification, is an April Fools RFC. The IETF releases these almost yearly on 1st April, and ROT13 "encryption" would be pointless, since by knowning the encryption algortihm, you would be able to "decrypt" the message, in this case the URL.
So no, it's not worth supporting, and it does not provide any "serious" protection. The only usage I've seen is in some CTFs or hacking/crypto challenges.
Together with another developer, I have embarked on a journey to create a hosted 'CRM Style' application that will cater to enterprise level businesses. These businesses will be accessing our application remotely and so the hosted nature of the application will require certain features. For example, to guarantee a level of professional service the following things must be true:
internationalization requires multiple languages and presentation of date/time for various timezones and locales
transactional capability for batch processing of tasks and rollback capabilities
security concerns for keeping data safe and remote invocations secure from attack
etcetera, the list goes on and on
Due to these concerns and my role as the developer most responsible for the server side development, I am very interested in the choices I make early on. Regarding timezones and languages for example, are there issues related to my choice of database or data fields? Do I choose to use a UTC timestamp or date field throughout the application and if so is there a standard format for that? Also, regarding different languages, am I supposed to ensure the data is stored in the database as UTF-8 or unicode?
I really want to avoid laying down the infrastructure of the system only to discover later that a fundamental decision was incorrect or not big enough, wide enough, smart enough, etc. Can someone point me in the right direction regarding these basic 'early' decisions?
EDIT: OK, I appreciate the broad responses, and now I see my question was a little too non-specific. I'd like to focus on the more specific elements that WERE present in the question, such as how to choose the proper format for storing a UTC date/time, or how to save my text data (do I specify a UTF format?).
If you are targeting enterprise CRM, then you will need a very high level of customizability and integrations with all kinds of systems. You will make mistakes in the design. Your only hope is to isolate each little piece of the code so that you can have a chance of fixing it later.
In short, basic software engineering principles are your best bet.
What you are discussing is called a multi-tenant application, wherein you have the same code base used by multiple customers (tenants) with logical or physical separation of data. Remember the fundamental rule of development: flexibility comes at the cost of complexity. The more flexible you make the system, the more complicated it will be.
RE: UTC
For a CRM application that stores things like when calls were made and when meetings took place, I would definitely store all those in UTC and let the user set their local timezone. However, you might run into dates which are timezone agnostic and for those, I would store whatever date was entered.
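As a small sketch of what that looks like in practice (the column and variable names here are illustrative, not prescriptive): store an aware UTC timestamp, and convert to the user's configured timezone only for display.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Store: always an aware UTC timestamp (e.g. in a TIMESTAMP WITH TIME ZONE
# column, or as an ISO 8601 string such as "2024-05-01T13:45:00+00:00").
call_logged_at = datetime.now(timezone.utc)
stored_value = call_logged_at.isoformat()

# Display: convert to the user's timezone only at presentation time.
user_tz = ZoneInfo("Europe/Amsterdam")
print(datetime.fromisoformat(stored_value).astimezone(user_tz))
```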
RE: Unicode
Yes, I would use Unicode for all user-entered data. However, that will not get you localization. If, for example, a single company has a user in Hong Kong entering text in Chinese and a user in Amsterdam entering text in Dutch, you are not going to get automatic translation. Things like dates and number formats can be localized, but raw text like names used in drop lists and such can be a chore to localize.
As you have not mentioned what you think about the issue, you may find my answer or parts of it rather basic.
If you don't need to, don't use a low-level language. I would usually use Python for the first version of a CRM application (with the hope that it would be good enough for the next versions), but this decision also depends on the domain community.
Try to write as little code on your own as possible, relying instead on third-party libraries. People may disagree on this, but I would write the code myself only as a last resort. The next point is important, though.
When selecting a library/framework to use, make sure the party behind it is going to last, the library is stable, and the software license suits your needs.
Other general rules apply: focus on the customer, use continuous integration/testing, etc., use good software practices like logging etc.
Nothing is ever stored as "unicode", because that is an abstract concept. Unicode is always stored in some kind of Unicode transformation format (UTF) (well, or UCS, but I have never seen that used anywhere). The most commonly used UTF is UTF-8, but I suggest using whatever is native/default to your platform. wikipedia
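A tiny illustration of the point (the sample string is arbitrary): the same Unicode string becomes different byte sequences depending on which transformation format you encode it with, and the database column ultimately stores one of those byte forms.

```python
name = "José Müller 北京"

utf8_bytes = name.encode("utf-8")
utf16_bytes = name.encode("utf-16")

print(len(utf8_bytes), len(utf16_bytes))   # different lengths for the same text
print(utf8_bytes.decode("utf-8") == name)  # True: the round trip is lossless
```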