SagePay repeat payments using CV2 number [closed]

When performing a repeat payment we need to provide the CV2 number because SagePay does not store it. As we are taking the payments automatically (overnight task) it means we need to store the number in our database (as the customer will not be typing it in).
We also store the last 4 digits of the card number and the expiry date for information.
Does any of the above make us non-PCI compliant, or does it not matter because we are not storing the actual card details on our server?

You are not allowed to store the CV2 number post auth. That's a big no-no.
Sage Pay will accept a REPEAT payment (using a CA MID) without the CV2. Your acquiring fee will be based on the fact that you are using a CA (cardholder-absent) merchant account. A sketch of such a REPEAT request follows.
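To illustrate, here is a minimal sketch in Java of a Sage Pay Direct REPEAT registration post. The endpoint URL and field names follow the Sage Pay Direct protocol as I recall it; treat both as assumptions and check them against your protocol version. The point is that no CV2 is sent at all, only references to the original, CV2-checked transaction.

// Sketch of a Sage Pay Direct REPEAT registration post. Endpoint and field
// names are assumptions based on the Direct protocol; verify before use.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class SagePayRepeat {
    public static void main(String[] args) throws Exception {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("VPSProtocol", "3.00");          // protocol version (assumption)
        fields.put("TxType", "REPEAT");             // repeat against a prior transaction
        fields.put("Vendor", "yourvendorname");     // your Sage Pay vendor name
        fields.put("VendorTxCode", "REPEAT-0001");  // your unique reference for this charge
        fields.put("Amount", "9.99");
        fields.put("Currency", "GBP");
        fields.put("Description", "Monthly subscription");
        // The "Related*" fields identify the original, CV2-checked payment:
        fields.put("RelatedVPSTxId", "{ORIGINAL-VPS-TX-ID}");
        fields.put("RelatedVendorTxCode", "ORIG-0001");
        fields.put("RelatedSecurityKey", "{ORIGINAL-SECURITY-KEY}");
        fields.put("RelatedTxAuthNo", "{ORIGINAL-AUTH-NO}");

        StringBuilder body = new StringBuilder();
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (body.length() > 0) body.append('&');
            body.append(e.getKey()).append('=')
                .append(URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8));
        }

        // Test-system endpoint (assumption); swap for the live URL in production.
        URL url = new URL("https://test.sagepay.com/gateway/service/repeat.vsp");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.toString().getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}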

Concur. Storing the CVV is a big no-no, and if the credit card companies suspect you are doing it, they will precipitate a forensic audit (they will send an investigator out to clone your hard drives and take a forensic look at your database) that will cost your client at least $10k, plus a fine. All major payment processors furnish an approved method for recurring payments that does not require presenting the CVV on charges after the first: the initiating charge is made while the customer is online, and the CVV is checked at that point without ever being stored. Research that method and use what the processor provides; otherwise the consequences for your client can be rather severe.

Related

Which Google places API query results are allowed to be stored in a database? [closed]

I am exploring the Google APIs, mostly the Places API. Because the number of requests to the Google Places API is limited to 100,000, I am looking for ways to minimize the number of requests sent to the API. I was thinking of using a database to store previously received responses, so that in the future I could retrieve them without calling the API, and would only make an API request when the needed data had not previously been stored in my database.
According to the Google API terms of use, specifically section 10.1.3, Restrictions against Copying or Data Export, it is not allowed to store data indefinitely, but it is legal to cache it temporarily:
You must not pre-fetch, cache, or store any Content, except that you may store: (i) limited amounts of Content for the purpose of improving the performance of your Maps API Implementation if you do so temporarily (and in no event for more than 30 calendar days), securely, and in a manner that does not permit use of the Content outside of the Service; and (ii) any content identifier or key that the Maps APIs Documentation specifically permits you to store. For example, you must not use the Content to create an independent database of "places" or other local listings information.
I find this section not well explained. Can I store any data received from the API in my database for 30 days, or only the IDs of places? In some other contexts I have read that only the IDs may be stored. I understood it this way: I can store place IDs indefinitely, but can use the full data only for 30 days.
Because I have only been reading about the Google APIs for a couple of days, I may have missed some term of use, so I would be really thankful if you could help me.
Any suggestions on how to minimize the number of calls to the APIs, or experiences from real projects using them, would be really appreciated. Suggestions for alternative APIs that provide similar functionality would also be very helpful.
Thank you in advance!
From my experience with the Google Places API, your understanding is just about correct. Let me explain the two stipulations in my own words:
i) Without prefetching, or redistributing outside your application, you may cache API results for up to 30 days.
ii) You can use a place ID or key in your application-specific data, but nothing else (e.g. if your app lets users "check in" to places, you can store a list of place IDs they have visited on a user object and look up the places by ID as needed, but you can't store a list of all the places with Google's names/details).
In order to reduce the number of API calls and accelerate my app, what I do is cache the nearby place calls in a simple key-value cache, where the key is the lat-lng pair rounded to a certain precision (so that calls within a certain radius will hit the cache) and the value is the entire JSON result string. Here is my code, which is Java running on Google's App Engine:
// Using 4 decimal places for rounding represents approximately 11 meters of precision:
// http://gis.stackexchange.com/questions/8650/how-to-measure-the-accuracy-of-latitude-and-longitude
import java.math.BigDecimal;
import java.math.RoundingMode;
import com.google.appengine.api.memcache.Expiration;
import com.google.appengine.api.memcache.MemcacheService;
...
public static final int LAT_LONG_CACHE_PRECISION = 4;
public static final int CACHE_DURATION_SEC = 24 * 60 * 60; // one day in seconds
...
// Nearby coordinates round to the same key, so they hit the same cache entry.
String cacheKey = "lat,lng:" + round(latitude) + "," + round(longitude);
asyncCache.put(cacheKey, dataJSON, Expiration.byDeltaSeconds(CACHE_DURATION_SEC), MemcacheService.SetPolicy.SET_ALWAYS);
...
// Half-up rounding at a fixed scale keeps keys stable across calls.
private static double round(double value) {
    BigDecimal bd = new BigDecimal(value);
    bd = bd.setScale(LAT_LONG_CACHE_PRECISION, RoundingMode.HALF_UP);
    return bd.doubleValue();
}
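For completeness, the read path is cache-first: look up the same rounded key in memcache and only call the Places API on a miss. Here is a minimal sketch under the same App Engine setup; fetchNearbyPlacesFromApi is a hypothetical helper standing in for the real HTTP request.

// Read path: same rounded key, memcache first, Places API only on a miss.
// fetchNearbyPlacesFromApi is a hypothetical helper for the actual HTTP call;
// syncCache comes from MemcacheServiceFactory.getMemcacheService().
String cacheKey = "lat,lng:" + round(latitude) + "," + round(longitude);
String dataJSON = (String) syncCache.get(cacheKey);
if (dataJSON == null) {
    dataJSON = fetchNearbyPlacesFromApi(latitude, longitude); // cache miss: one real API request
    asyncCache.put(cacheKey, dataJSON,
            Expiration.byDeltaSeconds(CACHE_DURATION_SEC),
            MemcacheService.SetPolicy.SET_ALWAYS);
}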
As for alternative APIs, I would suggest you look at the following:
Yelp API - provides bar/restaurant data that Google lacks
Facebook API - easy to use if you're already using Facebook's SDK
Factual: Places Crosswalk - aggregates and normalizes places data from many sources, including Facebook and Yelp, but not Google
Currently I'm only using Google Places API, but I'm planning on adding Yelp or Factual later to improve the results for the end user.

Google play app store optimization tips [closed]

I published my game on Google Play two weeks ago and I am not getting installs. How can I promote it, get it to appear in search results, and help players find my game?
For instant help, all I have for you is keyword suggestions. Did you do keyword research before uploading your game? If not, start with that.
Finding The Best Keyword For Your App (ASO)
According to a MobileDEVHQ test, apps with keywords in the title ranked on average 10.3% higher than those without a keyword in the title.
Unlike Apple's App Store, Google Play does not have a keyword field for the words you think players might use to find your app.
However, Google will use the words in your app description. That means everything you write about your app can be used to match search queries, so I personally recommend paying close attention to your app description.
On the other hand, your app title and company name carry high weight in search results as well.
Hope this helps.

retrieving subset of FHIR resource [closed]

All,
I'm interested in the ability to retrieve a specific element within a FHIR resource using a single URL call. For example, suppose I'm interested in the gender of my patients. I would like to read just that element via one URL, without having to walk the XML node path every time. Right now, this functionality does not appear to exist. What do you think about the usefulness of this? I would like to get a sense of the community interest. Thanks.
-Jeff
For the default query mechanism, you can't bring back anything other than the full resource. (And you don't even have a guarantee that the desired element will be present on all instances of the resource unless that element was part of your search criteria - in which case, why bother asking? :>) There's a new mechanism for defining custom queries: refer to _query in the search/query section of the FHIR spec. However, it's not clear whether this will allow retrieval of anything other than full resource instances either.
This functionality does not exist at this time. It's on the wishlist, and we're trying to decide whether we can frame it in a sensible and safe fashion. The case you describe is relatively obvious, but many others aren't. And, in fact, when I think about it, it's not really clear how it would work. What do you get back? Just the gender element? The server would need to do the node walk for you, in effect, and you would get a profusion of different schemas to deal with in return. It's not really obvious to me that this is a net saving for the client, and it's certainly a greater cost for the server.
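To make the trade-off concrete, here is a small sketch of the node walk a client currently performs itself: fetch the full Patient resource and extract one element. The server base URL is a placeholder, and it assumes a FHIR version where gender is a simple element carrying its value in a value attribute.

// Sketch of the client-side node walk: fetch the full Patient XML, then pull
// out one element. The base URL is a placeholder, and the XPath assumes
// <gender value="..."/>; local-name() sidesteps the FHIR XML namespace.
import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class FhirGenderLookup {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new URL("http://fhir.example.org/Patient/123").openStream());
        String gender = XPathFactory.newInstance().newXPath()
                .evaluate("/*[local-name()='Patient']/*[local-name()='gender']/@value", doc);
        System.out.println("gender = " + gender);
    }
}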

How to look up elevation data by lat/lng [closed]

I am planning an app that will need the ability to look up the elevation of geographic points by lat/lng. Ideally I would like something that would work worldwide, but US-only would also suffice. I have looked at using the USGS Elevation Query Web Service, however it only allows you to query for one point at a time, and I will need to look up several hundred, and possibly up to several thousand. I also considered downloading & hosting the National Elevation Dataset myself, but it's almost 100 gigs, and apparently the USGS only allows you to download 1.5 gigs at a time.
Can anyone familiar with GIS recommend a good solution for me? I'm looking for something as lightweight & simple as possible. I am completely new to GIS, so I would really appreciate suggestions on where to get the data, how to store it, and how/what to use when working with it.
Thanks in advance.
EDIT: Just to clarify, the data points I need are not predetermined. They are arbitrary points chosen by the user (by interacting with a google maps mashup), so I do need to be able to query for any point, not just a small subset.
EDIT 2: If there is no lightweight or simple solution, I'll take whatever I can get =)
I'll give you one of the best "secrets" that I learned throughout the years after going through many different pains (leeching scripts, manual clicking, etc). It is an old-school trick... contact a real person there!
The best way to get the NED elevation dataset is to contact USGS's EROS group directly at bulkdatainfo_at_usgs.gov
You send them an external drive and after 4 to 8 weeks (usually much less than that) they will send you the entire dataset that you requested.
Then use GDAL to query your points in a way similar to this example. You may want to read the Affine Geotransform section of the GDAL Data Model first; the sketch below shows the arithmetic involved.
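For orientation, the affine geotransform is just six numbers, and for a north-up raster the lon/lat-to-pixel inversion is two lines of arithmetic. A sketch with made-up coefficients (roughly 1-arc-second cells, not from a real file):

// GDAL's affine geotransform maps raster (col,row) to geographic (x,y):
//   x = gt[0] + col*gt[1] + row*gt[2]
//   y = gt[3] + col*gt[4] + row*gt[5]
// For north-up images the rotation terms gt[2] and gt[5-1] are zero, so the
// inverse used below is simple division. Coefficients here are illustrative.
public class GeoTransformDemo {
    public static void main(String[] args) {
        // {originX, pixelWidth, rowRotation, originY, colRotation, pixelHeight}
        double[] gt = {-125.0, 0.000277778, 0.0, 50.0, 0.0, -0.000277778};
        double lon = -122.4194, lat = 37.7749;
        int col = (int) ((lon - gt[0]) / gt[1]);
        int row = (int) ((lat - gt[3]) / gt[5]);
        System.out.println("read elevation at raster cell (col=" + col + ", row=" + row + ")");
    }
}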
I recently stumbled onto this question while doing research. I don't have a complete, simple answer either, but there are some other options not listed in the answers so far:
Google Elevation API: 25,000 requests/day, limited to Google Maps applications
Lat/Lon to Elevation: 2 points / second
GPSVisualizer: no published speed, but not intended for general DEM use
Shuttle Radar Topography: alternative to the NED, 7 gigabytes for the US.
The USGS Elevation Query Web Service only allows one query at a time, but it accepts requests via SOAP, HTTP GET, or HTTP POST. Choose your favorite language and write a script that generates a request for each of your data points; a sketch of such a script follows.
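As a starting point, here is what such a script might look like in Java. The endpoint and its query parameters are placeholders for whatever GET interface the service actually exposes.

// Sketch of a batching script: one HTTP GET per point. The endpoint and its
// parameters are placeholders; substitute the real service's GET interface.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class ElevationBatch {
    public static void main(String[] args) throws Exception {
        double[][] points = {{37.7749, -122.4194}, {40.7128, -74.0060}};
        for (double[] p : points) {
            URL url = new URL("https://elevation.example.org/query?lat=" + p[0] + "&lng=" + p[1]);
            try (BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()))) {
                System.out.println(p[0] + "," + p[1] + " -> " + in.readLine());
            }
            Thread.sleep(500); // be polite: stay under the service's rate limit
        }
    }
}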
I bought a GeoIP database and store the long/lat data for every post in MySQL. Then I use the Google Maps API, passing the long/lat data to the map dynamically. The result is that all my posts are shown on the map, and the nearest posts to a given location are displayed as well.
What you need: a GeoIP database, a query that calculates distance in miles based on a given long/lat pair, the Google Maps API, and PHP to pass the API variables dynamically.
Example on my site: Matchimedia.com.

How do you perform address validation? [closed]

Is it even possible to perform address (physical, not e-mail) validation? It seems like the sheer number of address formats, even in the US alone, would make this a fairly difficult task. On the other hand it seems like a task that would be necessary for several business requirements.
Here's a free and sort of "outside the box" way to do it. Not 100% perfect, but it should reject blatantly non-existent addresses.
Submit the entire address to Google's geocoding web service. This service attempts to return the exact coordinates of the location you feed it, i.e. latitude and longitude.
In my experience, if the address is invalid you will get a result code of 602 from the service. There's definitely a possibility of false positives or false negatives, but used in conjunction with other consistency checks it could be useful.
(Yahoo's geocoding web service, on the other hand, will return the coordinates of the center of the town if the town exists but the rest of the address is bogus. Potentially useful as long as you pay close attention to the "precision" field in the result).
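A minimal sketch of this check in Java, assuming a hypothetical geocoding endpoint and a JSON status field; the real URL, parameters, and status values depend on the provider and its API version:

// Geocode-and-see check: send the address to a geocoder and treat a non-OK
// status as a likely-invalid address. URL and JSON field are placeholders.
import java.io.InputStream;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class GeocodeCheck {
    public static void main(String[] args) throws Exception {
        String address = "1600 Pennsylvania Avenue, Washington, DC";
        URL url = new URL("https://geocoder.example.org/geocode?q="
                + URLEncoder.encode(address, StandardCharsets.UTF_8));
        try (InputStream is = url.openStream(); Scanner s = new Scanner(is).useDelimiter("\\A")) {
            String body = s.hasNext() ? s.next() : "";
            // Crude status check; a real client would parse the JSON properly.
            boolean looksValid = body.contains("\"status\":\"OK\"");
            System.out.println(looksValid ? "plausible address" : "likely invalid address");
        }
    }
}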
There are a number of good answers in here but most of them make the assumption that the user wants an "API" solution where they must write code to connect to a 3rd-party service and/or screen scrape the USPS. This is all well and good, but should be factored into the business requirements and costs associated with the implementation and then weighed against the desired benefits.
Depending upon the business requirements and the way the data is received into the system, a real-time address processing solution may be the best bet. If a real-time solution is required, you will want to consider the license agreements and technical limitations of the Google Maps/Bing/Yahoo APIs; they typically limit the number of calls you can make each day. The USPS Web Tools API is similar, and in addition it restricts how and why you can use the system and how you are allowed to use the data afterward.
At the same time, there are a handful of great service providers that can easily process a static list of addresses. Essentially, you give the service provider a CSV file or Excel file, they clean it up and get it back to you. It's a one-time deal with no long-term commitment or obligation—usually.
Full disclosure: I'm the founder of SmartyStreets. We do address verification for addresses within the United States. We can easily CASS-certify a list, and we also offer an address verification web service API. We have no hidden fees, contracts, or anything. You use our service until you no longer need it and you can walk away. (Unlike cell phone companies that require a contract.)
USPS has an address cleaner online, which someone has screen scraped into a poor man's webservice. However, if you're doing this often enough, it'd be a better idea to apply for a USPS account and call their own webservice.
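If you do go the official route, the call is a single HTTP GET with an XML document in the query string. Here is a sketch from memory of the USPS Web Tools "Verify" API; the endpoint and field names should be checked against the current USPS documentation before relying on them:

// Sketch of a USPS Web Tools address verification call. The general shape
// (XML in the XML query parameter, with your Web Tools USERID) is from memory
// of that API; verify the details against the current USPS documentation.
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class UspsVerify {
    public static void main(String[] args) throws Exception {
        String xml = "<AddressValidateRequest USERID=\"YOUR_USERID\">"
                + "<Address><Address1></Address1><Address2>1600 Pennsylvania Ave NW</Address2>"
                + "<City>Washington</City><State>DC</State><Zip5></Zip5><Zip4></Zip4>"
                + "</Address></AddressValidateRequest>";
        URL url = new URL("https://secure.shippingapis.com/ShippingAPI.dll?API=Verify&XML="
                + URLEncoder.encode(xml, StandardCharsets.UTF_8));
        try (Scanner s = new Scanner(url.openStream()).useDelimiter("\\A")) {
            // Response is the standardized address, or an <Error> block on failure.
            System.out.println(s.hasNext() ? s.next() : "");
        }
    }
}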
I will refer you to my blog post, A lesson in address storage, where I go into some of the techniques and algorithms used in address validation. My key thought is: don't be lazy with address storage, or it will cause you nothing but headaches in the future!
Also, there is another Stack Overflow question on this topic entitled How should international geographic addresses be stored in a relational database.
In the course of developing an in-house address verification service at a German company I used to work for, I came across a number of ways to tackle this issue. I'll do my best to sum up my findings below:
Free, Open Source Software
Clearly, the first approach anyone would take is an open-source one (like openstreetmap.org), which is never a bad idea. But whether you can really put it to good and reliable use depends on how much you need to rely on the results.
Addresses are an incredibly variable thing. Verifying U.S. addresses is not an easy task, but it is bearable; once you move to Europe, though, especially the U.K. with its extensive postal code system, the open-source approach simply lacks data.
Web Services / APIs
Enterprise-Class Software
Money gets it done, obviously. But not every business or developer can spend ~$0.15 per address lookup (that's $150 for 1,000 API requests) - a very expensive business model that the vast majority of address validation APIs have adopted.
What I ended up integrating: streetlayer API
Since I was not willing to take on the programmatic approach of verifying address data manually, I finally concluded that I needed an API with a price tag that would not make my boss want to fire me, while still delivering solid and reliable international verification results.
Long story short, I ended up integrating an API built by apilayer, called "streetlayer API". I was easily convinced by a simple JSON integration, surprisingly accurate validation results and their developer-friendly pricing. Also, 100 requests/month are entirely free.
Hope this helps!
I have used the services of http://www.melissadata.com. Their "address object" works very well. It's pricey, yes. But when you consider the cost of writing your own solution, the cost of dirty data in your application, returned mailers, lost sales, and the like, the cost can be justified.
For us-based address data my company has used GeoStan. It has bindings for C and Java (and we created a Perl binding). Note that it is a commercial product and isn't cheap. It is quite fast though (~300 addresses per second) and offers features like CASS certification (USPS bulk mail discount), DPV (Delivery point verification) flagging, and LON/LAT geocoding.
There is a Perl module Geo::PostalAddress, but it uses heuristics and doesn't have the other features mentioned for GeoStan.
Edit: some have mentioned "doing it yourself". If you do decide to do this, a good source of information to start with is the US Census TIGER data set, which contains a lot of information about the US, including address information.
As seen on reddit:
<?php
// Geocode an address with Yahoo's geocoding service and dump the decoded JSON.
$address = urlencode('1600 Pennsylvania Avenue, Washington, DC');
$json = json_decode(file_get_contents("http://where.yahooapis.com/geocode?q=$address&flags=J"));
print_r($json);
The Fixaddress.com service provides the following:
1) Address validation.
2) Address correction.
3) Address spell correction.
4) Correction of phonetic mistakes in addresses.
Fixaddress.com uses USPS and TIGER data as reference data.
For more detail, see http://www.fixaddress.com/
One area where address lookups have to be performed reliably is for VOIP E911 services. I know companies reliably using the following services for this:
Bandwidth.com 9-1-1 Access API MSAG Address Validation
MSAG = Master Street Address Guide
https://www.bandwidth.com/9-1-1/
SmartyStreets US Street Address API
https://smartystreets.com/docs/cloud/us-street-api
There are companies that provide this service. Service bureaus that deal with mass mailing will scrub an entire mailing list so that it's in the proper format, which results in a discount on postage. The USPS sells databases of address information that can be used to develop custom solutions. They also have lists of approved vendors who provide this kind of software and service.
There are some (but not many) packages that have APIs for hooking address validation into your software.
However, you're right that it's a pretty nasty problem.
http://www.usps.com/ncsc/ziplookup/vendorslicensees.htm
As mentioned, there are many services out there. If you are looking to truly validate the entire address, I highly recommend a web-service-based solution so that changes can quickly be picked up by your application.
In addition to the services listed above, webservicex.net has a US address validation service: http://www.webservicex.net/WCF/ServiceDetails.aspx?SID=24
We have had success with Perfect Address.
Their database has all the US street names and street number ranges. It also acts as a pretty decent parser for free-form address fields, if you are lucky enough to have that kind of data.
Validating that it is a real address is one thing.
But if you're trying to validate that a given person lives at a given address, your only near-guarantee would be a test mailing to the address, and even that is not certain if the person is organised or knows somebody at that address.
Otherwise people could just specify an arbitrary address which they know exists, and it would mean nothing to you.
The best you can do for immediate results is to request a photographed or scanned copy of the head of their bank statement or some other proof of recent residence, because at least then they have to work harder to forge it, and forged documents show up easily under a basic level of image forensic analysis.
There is no global solution. For any given country it is at best rather tricky.
In the UK, the Post Office controls postal addresses and can provide (at a cost) address information for validation purposes.
Government agencies also keep extensive lists of addresses, and these are centrally collated in the NLPG (National Land and Property Gazetteer).
Actually validating against these lists is very difficult. Most people don't even know exactly how their address is held by the Post Office. Some businesses don't even know what number they are on a particular street.
Your best bet is to approach a company that specialises in this kind of thing.
Yahoo also has a Placemaker API. It is good only for locations, but it provides a universal ID for locations worldwide.
There does not appear to be an equivalent standard on the ISO list.
You could also try SAP's Data Quality solutions, which are available both as a server platform, for processing a large number of requests, and as an embeddable SDK, if you want to run it in-process with your application. We use it in our application and it's very robust and scalable.
NAICS.com is coming out with an API that will add all kinds of key business data including street address. This would happen on the fly as your site's forms are processed. https://www.naics.com/business-intelligence-api/
You can try the Pitney Bowes "IdentifyAddress" API, available at https://identify.pitneybowes.com/
The service analyses and compares the input addresses against known address databases around the world to output a standardized result. It corrects addresses, adds missing postal information, and formats the output using the format preferred by the applicable postal authority. It also uses additional address databases so it can provide enhanced detail, including address quality, type of address, transliteration (such as from Chinese Kanji to Latin characters), and whether an address is validated to the premise/house-number, street, or city level of reference information.
You will find a lot of samples and SDKs available on the site, and I found it extremely easy to integrate.
For US addresses you can require a valid state and verify that the ZIP code is valid. You could even check that the ZIP code is in the right state, but beyond that I don't think there are many tests you could run that wouldn't produce a lot of false negatives. A sketch of these checks follows below.
What are you trying to do -- prevent simple mistakes or enforce some kind of identity check?
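For the simple-mistakes end of that spectrum, here is a minimal sketch of the checks just described. The state-to-ZIP-prefix table is deliberately tiny and illustrative; a real one would cover every state and its full prefix ranges.

// ZIP format test plus a tiny, deliberately incomplete state -> ZIP-prefix
// table (illustrative values only; real prefix ranges are more precise).
import java.util.Map;
import java.util.regex.Pattern;

public class ZipSanityCheck {
    private static final Pattern ZIP = Pattern.compile("\\d{5}(-\\d{4})?");
    private static final Map<String, String> STATE_ZIP_PREFIX = Map.of(
            "DC", "20", "NY", "1", "CA", "9"); // partial, for illustration
    public static boolean plausible(String state, String zip) {
        if (!ZIP.matcher(zip).matches()) return false;
        String prefix = STATE_ZIP_PREFIX.get(state);
        return prefix == null || zip.startsWith(prefix); // unknown state: don't reject
    }
    public static void main(String[] args) {
        System.out.println(plausible("DC", "20500"));  // true
        System.out.println(plausible("CA", "20500"));  // false: DC ZIP with CA state
    }
}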
