linkedin api company search results for location

I can't seem to find this in the documentation; it's probably there, I just can't see it.
I make a request like below, but I want to limit the results to a geographical region. Is there a query parameter I can use to do this? Something like location={something}?
https://api.linkedin.com/v1/company-search:(facets,companies:(name,description,square-logo-url,website-url))?keywords=something&oauth2_access_token=xxxxxxx&format=json

I worked it out.
https://api.linkedin.com/v1/company-search:(facets,companies:(name,description,square-logo-url,website-url,locations))?oauth2_access_token=xxxxxxx&keywords=company+name&facet=location%2Cau%3A0&count=1&format=json
(you can remove count=1; that was just for my purposes)
So basically you need to use an encoded geographical location code. In my URL I'm targeting Australia, which is au:0 (or us:0 for the US).
You can also target a more specific geographical location by changing the 0 to another number; you can find most of these numbers here. For example, us:84 is the San Francisco Bay Area.
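A minimal sketch of that request in Python (using the requests library), with the token and keyword left as placeholders. The endpoint, field selector and facet syntax are taken from the URL above; note that LinkedIn's v1 REST API has since been retired, so treat this as an illustration of the structure rather than a currently working call.

# Sketch only: placeholder token and keyword; facet syntax mirrors the URL above.
import requests

BASE = "https://api.linkedin.com/v1/company-search:(facets,companies:(name,description,square-logo-url,website-url,locations))"

params = {
    "keywords": "company name",
    "facet": "location,au:0",  # au:0 = Australia, us:0 = United States, us:84 = San Francisco Bay Area
    "count": 1,                # optional; drop it to get a full page of results
    "format": "json",
    "oauth2_access_token": "xxxxxxx",
}

response = requests.get(BASE, params=params)
print(response.json())

requests URL-encodes the facet value to location%2Cau%3A0, which matches the encoded form in the URL above.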

Related

Optimize Google Places API Query for Prominent Parks, Mountains, Conservation Areas

First post on Stackoverflow.
I am using the Google API to sort images taken while traveling into organized folders, append tags and rename files with relevant information. I have my code working well but am not always happy with the results. I want to be able to focus my query results on major tourist attractions such as National Parks, Ski Resorts, Beaches, etc. The problem I am finding is that the prominence "rankby" variable and the "radius" are not giving satisfactory results. Here is a typical query for Zion National Park.
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=37.269486111111,-112.948141666667&rankby=prominence&radius=50000&type=natural_feature,tourist_attraction,point_of_interest&keyword=&key=MYAPIKEY
The most prominent result is Springdale, which is the town where you enter the park; Zion National Park is listed much further down in the results. What my code does is take the lat/lon extracted from the EXIF data and do a Google API Nearby Search request to find the Place ID for where the photo was taken (see the sketch after this question). It then does another API request for Place Details, using the place_id provided by the previous step, to cut down on the information I need to parse.
https://maps.googleapis.com/maps/api/place/details/json?place_id=ChIJ8R5RCzaNyoARegi3rqVkstk&fields=name,address_component&key=MYAPIKEY
I can force the nearby search to return a National Park by searching against "National Park" in the keywords variable but that limits my project to only being able to provide National Park results since the keywords field can only accept one string.
I would like part of my query to be able to return the most prominent tourist attraction at the general level, i.e. Zion National Park, Yosemite National Park, etc., so I can sort images into folders by the general name, while another part of the query provides the exact location, e.g. "I am on this trail" or "at this lookout". The problem is that the Google API sees these specific locations ("Trail", "Lookout") as tourist attractions, parks, establishments, etc. as well, so it chooses those first.
What I need help with is figuring out whether there is a better way to structure my query to return the high-level name of the major park. From my understanding, the types field only searches on the first type even if there are more in the list, and the keywords field can only accept one string, making it impossible for one phrase to capture all major destinations at a high level.
Perhaps it needs to be done with more queries but I am trying to limit the number of queries to stay inside the free quota. Maybe it will just take a long time to fully sort my files.
I have read through and implemented the Google API structure. I'm hoping someone can provide a more detailed query structure, or a method to parse out truly prominent locations rather than Google's interpretation of prominence, since that can be affected by user ratings, etc. and is not always accurate.
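For reference, a minimal sketch of the two-step flow described in the question (a Nearby Search around the EXIF coordinates, then a Place Details lookup for the top result); MYAPIKEY is a placeholder and the parameters mirror the query URLs above.

# Sketch only: Nearby Search near the photo location, then Place Details for the top result.
import requests

API_KEY = "MYAPIKEY"
lat, lng = 37.269486111111, -112.948141666667  # taken from the photo's EXIF data

nearby = requests.get(
    "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
    params={
        "location": f"{lat},{lng}",
        "rankby": "prominence",
        "radius": 50000,
        "type": "tourist_attraction",  # Nearby Search only honours one type per request
        "key": API_KEY,
    },
).json()

place_id = nearby["results"][0]["place_id"]  # most prominent result

details = requests.get(
    "https://maps.googleapis.com/maps/api/place/details/json",
    params={
        "place_id": place_id,
        "fields": "name,address_component",
        "key": API_KEY,
    },
).json()

print(details["result"]["name"])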

google distance matrix api zip code origin address changes automatically

https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=53022,DC&destinations=35210&key=xxx
The origin changes to 53170.
What can I do to fix it?
So, I recreated your issue and found that it does change to 53170, but I also see that it refers to a location in France.
But when I specify the country in both origin and destination:
origins=53022, USA
destinations=35210, USA
this is what I get:
"destination_addresses":[
"Birmingham, AL 35210, USA"
],
"origin_addresses":[
"Germantown, WI 53022, USA"]
Even though the number stays correct, I can see that it is not in DC.
My advice: the more specific your addresses are, the better the results you will get from Google. Your query is not very specific, so Google tries to make sense of it in its own way and gives you those results.
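For example, a sketch of the same Distance Matrix call with the country spelled out in both parameters (the key is a placeholder):

# Sketch only: zip codes qualified with the country so Google can disambiguate them.
import requests

resp = requests.get(
    "https://maps.googleapis.com/maps/api/distancematrix/json",
    params={
        "units": "imperial",
        "origins": "53022, USA",       # Germantown, WI
        "destinations": "35210, USA",  # Birmingham, AL
        "key": "xxx",
    },
).json()

print(resp["origin_addresses"], resp["destination_addresses"])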
Happy Coding :)

Information Retrieval: Get place name by image

I am starting the development of a piece of software in which, given an image of a tourist spot (for example: St. Peter's Basilica, the Colosseum, etc.), I should retrieve the name of the spot (plus its related information). In addition to the image I will have the picture's coordinates (embedded as metadata). I know I can lean on the Google Images API using reverse search, where I give my image as input and get back a big set of similar images.
However, my question is: now that I have all the similar images, which approach can I take to retrieve the correct name of the place shown in the photo?
A second approach I am considering is to build my own dataset in my database and apply my own heuristic (filtering images by their location and then making the comparison over the resulting subset). Suggestions and advice are welcome, and thanks in advance.
An idea is to use the captions of the images (if available) as a query, retrieve a list of candidates and make use of a structured knowledge base to deduce the location name.
The situation is a lot trickier if there are no captions associated with the images, in which case you may use the fc7 layer output of a pre-trained convolutional net and query it against ImageNet to retrieve a ranked list of related images. Since those images have captions, you could again use those to get the location name.
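A rough sketch of that second idea, assuming PyTorch/torchvision are available: a pretrained VGG16 (its penultimate fully connected layer plays the role of fc7) embeds a query photo, and reference images with known captions/locations are ranked by cosine similarity. The file names are placeholders.

# Sketch only: embed images with a pretrained VGG16 and rank references by similarity.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

model = models.vgg16(pretrained=True)
model.classifier = model.classifier[:-1]  # drop the final layer, keep the 4096-d fc7-style output
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path):
    with torch.no_grad():
        x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        return F.normalize(model(x), dim=1)  # unit-length feature vector

query = embed("query_photo.jpg")
references = ["colosseum.jpg", "st_peters_basilica.jpg"]  # images whose captions/locations are known
scores = {p: float(query @ embed(p).T) for p in references}
print(max(scores, key=scores.get))  # closest reference image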

find a single specific place with google places api

I was wondering if it is possible to find a specific place using the google places api.
I know the name of the place, the address or website url, and the coordinates.
I need this to get the ratings this place has.
Is this possible? If not, is it going to be?
I think your best bet would be to do a nearbysearch with the location (lat,long), a small radius, and the name and types parameters to narrow it down. If you are targeting a specific place, then you can just manually find it in the results and use its reference for a Details request in your solution.
If the target place can be dynamic, for example based on user input, then you might want to show the user the list of results and let them choose the correct one. I don't think there's a way to guarantee that you will always get exactly the result you're looking for as, say, the first result in the list. Experiment with different types of requests and parameters and try to get a sense for the behaviour of the responses to find what will work best for your solution.
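A minimal sketch of that flow (what the answer calls the result's reference corresponds to today's place_id); the coordinates, place name and key below are placeholders.

# Sketch only: narrow a Nearby Search with coordinates and name, then fetch the rating.
import requests

API_KEY = "YOUR_KEY"

nearby = requests.get(
    "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
    params={
        "location": "41.9022,12.4539",   # known coordinates of the place
        "radius": 100,                   # small radius around that point
        "keyword": "St. Peter's Basilica",
        "key": API_KEY,
    },
).json()

if nearby["results"]:
    place_id = nearby["results"][0]["place_id"]
    details = requests.get(
        "https://maps.googleapis.com/maps/api/place/details/json",
        params={"place_id": place_id, "fields": "name,rating", "key": API_KEY},
    ).json()
    print(details["result"].get("rating"))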

Programmatically find common European street names

I am in the middle of designing a web form for German and French users. Within this form, the users would have to type street names several times.
I want to minimize the annoyance to the user, and offer autocomplete feature based on common French and German street names.
Any idea where I can find a royalty-free list?
Would your users have to type the same street name multiple times? Because you could easily prevent this by coding something that prefilled the fields.
Another option could be to use your user database as a resource. Query it for all the available street names entered by your existing users and use that to generate suggestions.
Of course this would only work if you have a considerable number of users.
[EDIT] You could have a look at OpenStreetMap with their Planet.osm dumps (or have a look here for a dump containing data for just Europe). That is basically the OSM database with all the map information they have, including street names. It's all in an XML format and streets seem to be stored as Ways. There are tools (e.g. Osmosis) to extract the data and put it into a database, or you could write something to plough through the data and filter out the street names for your database.
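As a sketch of the "plough through the data" route, assuming you already have a regional OSM XML extract on disk (the file name below is a placeholder): ways tagged with highway carry the street geometry, and their name tag holds the street name.

# Sketch only: stream an OSM XML extract and collect the name tag of every highway way.
import xml.etree.ElementTree as ET

street_names = set()
for _, elem in ET.iterparse("europe_extract.osm", events=("end",)):
    if elem.tag == "way":
        tags = {t.get("k"): t.get("v") for t in elem.findall("tag")}
        if "highway" in tags and "name" in tags:
            street_names.add(tags["name"])
        elem.clear()  # free memory as we stream through the file

print(len(street_names), "unique street names")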
Start with http://en.wikipedia.org/wiki/Category:Streets_in_Germany and http://en.wikipedia.org/wiki/Category:Streets_in_France. You may want to verify the Wikipedia copyright isn't more protective than would be suitable for your needs.
Edit (merged from my own comment): Of course, to answer the "programmatically" part of your question: figure out how to spider and scrape those Wikipedia category pages. The polite thing to do would be to cache it, rather than hitting it every time you need to get the street list; refreshing once every month or so should be sufficient, since the information is unlikely to change significantly.
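If you would rather not scrape the HTML, the MediaWiki API exposes the same listings via list=categorymembers; a small sketch (cache the output locally and refresh it rarely, as suggested above):

# Sketch only: pull page titles from a Wikipedia category via the MediaWiki API.
import requests

def category_members(category, lang="en"):
    titles, params = [], {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": "500",
        "format": "json",
    }
    url = f"https://{lang}.wikipedia.org/w/api.php"
    while True:
        data = requests.get(url, params=params).json()
        titles += [m["title"] for m in data["query"]["categorymembers"]]
        if "continue" not in data:
            return titles
        params.update(data["continue"])  # follow the API's continuation token

print(category_members("Category:Streets_in_Germany")[:10])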
You could start by pulling names via the Google API (e.g. find the lat/long outer bounds of Paris and work in toward the center), but since Google limits API use, it would probably take a very long time.
I had once contacted City of Bratislava about the street names list and they sent it to me as XLS. Maybe you could try doing that for your preferred cities.
I like Tom van Enckevort's suggestion, but I would be a little more specific than just pointing at the Planet.osm links, because most of them require some tool to deal with the supported formats (PBF, OSM XML, etc.).
In fact, take a look at the following link
http://download.gisgraphy.com/openstreetmap/
The files there are all in .txt format and if it's only the street names that you want to use, just extract the second field (name) and you are done.
As an fyi, I didn't have any use for the French files in my project, but mining the German files resulted (after normalization) in a little more than 380K unique entries (~6 MB in size)
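A small sketch of that extraction, assuming the fields are tab-separated (adjust the delimiter if the dump uses something else) and with a placeholder file name:

# Sketch only: collect the second field (the street name) from one of the .txt dumps.
names = set()
with open("streets_germany.txt", encoding="utf-8") as f:
    for line in f:
        fields = line.rstrip("\n").split("\t")
        if len(fields) > 1 and fields[1]:
            names.add(fields[1])  # second field = street name

print(len(names), "unique street names")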
#dusoft might be onto something - maybe someone at a government level can help? I don't think a simple list of street names can be copyrighted, nor any royalties charged. If that is the case, maybe you could even scrape some mapping data from something like a TomTom?
The "Deutsche Post" offers a list with all street names in Germany:
http://www.deutschepost.de/dpag?xmlFile=link1015590_3877
They don't mention the price, but I reckon it's not for free.
