How can I use OpenStreetMap to order waypoints? - ruby

I saw in articles that OpenStreetMap provides an API that, given a route with an origin, a destination, and multiple (unlimited?) waypoints, sorts the waypoints according to the best route.
I couldn't tell which endpoint it was. Could someone point me to the part of the documentation that explains how to achieve this? Is there a Ruby gem that wraps this endpoint?
Thank you very much

This is the traveling salesman problem. There is more than one OSM-based router for solving this problem. According to a similar question at help.openstreetmap.org:
All major OSM routing engines support this:
Mapzen's Valhalla ("Valhalla also includes tools like time+distance matrix computation, isochrones, elevation sampling, map matching and tour optimization (Travelling Salesman)").
Mapbox's OSRM ("The trip plugin solves the Traveling Salesman Problem using a greedy heuristic...").
GraphHopper, which uses the jsprit library for route optimization ("TSP problem can be modelled by defining a vehicle routing problem...").
None of these services has a free and unlimited online offering (it would quickly be abused by people trying to save on their own AWS costs). Mapzen has an offering where you register a free API key and use that. OSRM doesn't need an API key; you can just use it. GraphHopper requires registration, and while they have a free trial, I don't think they have a free tier.
All three are open source, and you can install and use them locally without limits.
For GraphHopper, take a look at the Route Optimization API. For OSRM, see the trip plugin.
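For illustration, here is a rough Ruby sketch against the public OSRM demo server's trip service (the server URL is OSRM's demo instance; the Berlin coordinates are just placeholders):

require "net/http"
require "json"

# Waypoints as (lon, lat) pairs; replace with your own.
waypoints = [[13.388860, 52.517037], [13.397634, 52.529407], [13.428555, 52.523219]]
coords = waypoints.map { |lon, lat| "#{lon},#{lat}" }.join(";")

# source=first and destination=last keep the origin and destination fixed
# while the intermediate waypoints are reordered.
uri = URI("https://router.project-osrm.org/trip/v1/driving/#{coords}?source=first&destination=last&roundtrip=false")

trip = JSON.parse(Net::HTTP.get(uri))
# Each returned waypoint carries a "waypoint_index": its position in the optimized order.
order = trip["waypoints"].map { |w| w["waypoint_index"] }
puts "Optimized visiting order: #{order.inspect}"

Keep in mind the demo server is rate-limited; for real workloads, run OSRM yourself.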

Set up your own OpenStreetMap server; this way you won't incur data access fees every time your app needs to run map queries.
Specifically, install the Valhalla routing server. It's free and open source, and best installed on a Linux box:
https://github.com/valhalla/valhalla
Or run a prebuilt Docker image instead:
https://hub.docker.com/r/abihf/valhalla/
https://github.com/interline-io/valhalla-docker
The server provides an API specifically for ordering waypoints:
https://valhalla.readthedocs.io/en/latest/api/optimized/api-reference/
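A minimal Ruby sketch against that endpoint might look like this, assuming a local Valhalla instance on its default port 8002 (the coordinates are placeholders; the first and last locations stay fixed while the ones in between get reordered):

require "net/http"
require "json"

uri = URI("http://localhost:8002/optimized_route")
body = {
  locations: [
    { lat: 40.042072, lon: -76.306572 },
    { lat: 39.992115, lon: -76.781559 },
    { lat: 39.984519, lon: -76.695075 },
    { lat: 39.996586, lon: -76.769028 }
  ],
  costing: "auto"
}

res = Net::HTTP.post(uri, body.to_json, "Content-Type" => "application/json")
route = JSON.parse(res.body)
# Each returned location carries an "original_index"; together they give
# the optimized visiting order.
puts route["trip"]["locations"].map { |l| l["original_index"] }.inspect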

Related

How to get the number of confirmations of a TX on a different network using Chainlink?

What would be the most reliable way of checking whether a given TX on a different network has been confirmed k times using Chainlink?
I know I can make an API call to Etherscan, for instance, but since this is a common use case I wonder if there are more reliable, well-known methods for doing it.
Chainlink itself advertises cross-chain solutions (https://chain.link/solutions/cross-chain) but I could not find any technical documentation about those. Pointers are welcome.
Chainlink CCIP (Cross-Chain Interoperability Protocol) is still in development.
In the meantime, you can make an API call to another blockchain with the Chainlink API feature. You'll have to do some work to ensure you have enough nodes making the API calls so the result stays decentralized, but that's essentially it!
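If you do fall back to the Etherscan approach mentioned in the question, a hedged Ruby sketch of the confirmation check could look like this (the tx hash, API key, and the k = 12 threshold are all placeholders):

require "net/http"
require "json"

ETHERSCAN = "https://api.etherscan.io/api"
API_KEY = "YOUR_ETHERSCAN_KEY" # placeholder

def etherscan(params)
  uri = URI(ETHERSCAN)
  uri.query = URI.encode_www_form(params.merge(apikey: API_KEY))
  JSON.parse(Net::HTTP.get(uri))
end

# Block the tx was mined in (blockNumber is null while it is still pending).
tx = etherscan(module: "proxy", action: "eth_getTransactionByHash",
               txhash: "0x...") # placeholder tx hash
head = etherscan(module: "proxy", action: "eth_blockNumber")

tx_block = tx.dig("result", "blockNumber")&.to_i(16)
confirmations = tx_block ? head["result"].to_i(16) - tx_block + 1 : 0
puts confirmations >= 12 ? "confirmed" : "only #{confirmations} confirmations"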

Using the geocoder gem to make more requests

I am working with the geocoder gem and would like to process a larger number of requests from one IP. By default, the Google API allows only 2,500 requests per day.
Please share your thoughts on how I can make more requests than the limit allows.
As stated before: using only the Google API, the only way around the limitation is to pay for it. Or, in a shadier way, make the requests from more than one IP/API key, which I would not recommend.
But to stay on the safe side, I would suggest mixing services, since there are a few more geocoding APIs out there for free.
With the right gem, mixing them is not a big issue:
http://www.rubygeocoder.com/
It supports a number of providers behind one interface. You would pretty much only have to add some rate-limiting counters to make sure you stay within each provider's limits, along the lines of the sketch below.
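A rough sketch of that idea with the geocoder gem, rotating providers behind per-provider counters (the quota numbers are illustrative, not the providers' real limits, and the :google lookup additionally needs an API key configured):

require "geocoder"

# Illustrative daily quotas; check each provider's current terms.
QUOTAS = { nominatim: 1000, google: 2500 }

def pick_provider(used)
  QUOTAS.keys.find { |p| used[p] < QUOTAS[p] } or raise "all quotas exhausted"
end

def geocode(address, used)
  provider = pick_provider(used)
  used[provider] += 1
  Geocoder.configure(lookup: provider)
  Geocoder.search(address)
end

used = Hash.new(0)
results = geocode("350 Fifth Avenue, New York, NY", used)
puts results.first&.coordinates.inspect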
Or go the heavyweight route and implement your own geocoding, for example with your own running OpenStreetMap database. The database can be downloaded here: http://wiki.openstreetmap.org/wiki/Planet.osm#Worldwide_data
Which way is best depends on your actual requirements and the resources you have available.

Are there performance issues of being a client of your own API?

Take Twitter, for example: they built twitter.com as a client of their own API. Could this be one of the reasons why Twitter is quite 'slow'?
Reference: http://engineering.twitter.com/2010/09/tech-behind-new-twittercom.html
Would you recommend using your own API for your main website/app?
If using your own API is OK, what are the ways to avoid performance issues?
Regarding using your own API: it's about trade-offs. In the Twitter example, by using their own API they were able to "allocate more resources to the API team." For them, that benefit outweighed a performance hit. There are other benefits not mentioned either, like being the first to vet your API and having a single, unified entry point into the system. There are drawbacks as well, which are mentioned in the link you posted.
For your application you should look at the architectural qualities you want to achieve and balance that with the constraints you are given and make your own choice. If ultra high performance is at the top of the list then craft your solution to meet that goal.
Regarding performance when using your own API: again, it depends. In the Twitter case they knew they would be accessing the API from JavaScript, so the physical hops are Browser --> Server --> DB. There is no way to get around these hops if you are doing client-server development. In the link you posted they talked about going directly to the DB. Yes, that would be faster, but I'm not sure how to do that from a JavaScript client. I suppose if they had used WebSockets to a custom API then that would have been faster, but at what development cost?
Summary: it's not using their own API that was the performance hit; it's that they wanted the client to be an HTTP hop away.
Please note that none of these comments covers what the server --> DB calls look like, the caching strategy, or any of the other dozen things that could be a bottleneck.

Web Hosting, Web Scaling

I have a simple web application to conduct online exams for college students. All questions are multiple-choice. Around 5000 users will be taking the exam. My backend is MySQL, with PHP on the front end. I want to know the hardware configuration for the servers that will be required to host this application and make it work seamlessly for the required number of users.
I am also looking at cloud solutions. If I choose Amazon EC2 instances, can somebody advise me on what type of EC2 instance I should pick for this application?
It is impossible to tell the exact specs of the servers required to run your setup, because there are too many variables. However, it is definitely a good question: when I was a student at university, a professor tried to do this without testing, and on the exam date the system got overloaded and the exam had to be cancelled!
Start with testing what you already have. You can use something like the ab tool or JMeter; they will simulate the requested load for you automatically, so you can check how your actual server performs and act accordingly. A rough Ruby equivalent is sketched below.
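This is a very rough stand-in for ab, just to show the shape of a load test (the URL and concurrency numbers are placeholders; use ab or JMeter for serious runs):

require "net/http"
require "uri"

uri = URI("http://your-exam-server.example/exam") # placeholder URL
concurrency = 50
requests_per_thread = 20

# Fire concurrent requests and print per-request latency in seconds.
threads = concurrency.times.map do
  Thread.new do
    requests_per_thread.times do
      started = Process.clock_gettime(Process::CLOCK_MONOTONIC)
      Net::HTTP.get_response(uri)
      puts Process.clock_gettime(Process::CLOCK_MONOTONIC) - started
    end
  end
end
threads.each(&:join)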
Application design is also important. For example, you can cache all the questions at the web layer to avoid database queries, and make the app client-heavy so that the server payload is minimal (a JSON response), reducing download time and load on the server.
Request multiple questions at once, and batch the user's responses together to decrease the number of AJAX calls, as in the sketch below.
Make use of a NoSQL solution to avoid RDBMS constraint overhead.
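To make the batching idea concrete, here is a hypothetical Ruby/Sinatra sketch (the stack in the question is PHP; the endpoint name and payload shape are invented for illustration):

require "sinatra"
require "json"

# In-memory stand-in for the real database write.
ANSWERS = []

post "/answers/batch" do
  payload = JSON.parse(request.body.read)
  # One request carries many answers, e.g.
  # { "user_id": 42, "answers": [{ "question_id": 1, "choice": "B" }, ...] }
  batch = payload.fetch("answers", [])
  ANSWERS.concat(batch) # replace with a single bulk INSERT in production
  content_type :json
  { saved: batch.size }.to_json
end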

Google Visualization API

I want a real and honest opinion: what do you think of the Google Visualization API?
Is it reliable to use? When I was reading the documentation I noticed that there are a lot of issues and defects to overcome. Also, can I use it to retrieve data from a MySQL database?
Thank you.
I am currently evaluating it. Compared to other JavaScript data visualization frameworks, I think it has a lot going for it:
dynamic loading is built-in
diverse; many things to choose from
looks really great!
the framework mostly takes care of picking whatever implementation fits the current browser
service-based; you don't need to download anything in advance
unified data source: just create one data table, and have multiple visualizations draw from that data
As a disadvantage, I'd like to mention security. Because it's all service-based, it is not so transparent what happens when you pass data into these API calls. And as far as I know, the API is free but not open source, so I can't really check what is going on behind the covers.
I think the Google Visualization API really shines if you want to very quickly whip up a visualization gadget for use in a blog or so, and you are not interested in deploying all kinds of plugins and libraries (for example, with jQuery-based frameworks, you may need to manage multiple JavaScript libraries that work together to deliver the goods). If, on the other hand, you are creating an application that you want to sell, you might want to keep more control over what components you are using, and I would probably consider using something like Flot.
But like I said, I am only evaluating at the moment; I am not using this in production.
Works really great for me. Can be customized fairly easily. Haven't seen any scaling issues. No data is exposed so security should not be an issue. - Arunabh Das
One point I want to add here is that the Google Visualization API cannot be downloaded; it's not available for offline usage. So an application that uses it must always be connected to the internet, otherwise I think it won't be able to render charts. Due to this limitation, the API cannot be used in applications for which an internet connection is not available.
I am currently working on a web-based application that will have the Google Visualization API added to it, and from a developer's perspective the Google Visualization API is very limited in what you can do with each individual chart. If I had a choice, I would probably look at dojox charting, just because of the extra flexibility that framework gives you.
If you are building any kind of large web application that will use charting extensively, then I would not recommend the Google Visualization API; it does not have enough flexibility for a large web application.
I am using the Google Visualization API and I want to stress that they still won't let you download it, which means that if their servers are down, your app will be down if you depend on it. I have been using it for about 4 months, and it has crashed on me once, so I'd say it's pretty reliable, and the documentation is really nice.
