I'm not really sure how to search for this on the internet; I tried some searches but never got the help I needed, so I'll just ask here. (Sorry if it's already been answered!)
I'm building an embedded system that runs on Windows. I'll gather some data and send it over the internet to read at home. I'll most probably use a 3G connection to connect my system (which will keep moving) to the internet and send the data over. I planned to use an FTP server with a Hamachi connection to send the files to another PC. It will be automated, so the only human action will be reading the data at home. I tested this, and it all works fine when I use a reliable connection, like when I'm at home.
My question is: will it work when my 3G connection drops, and how can I make this system reliable?
I want to keep storing the data while the connection is down and send it all when it's back online, but I don't know if the system will automatically reconnect (I can't have a person manually clicking 'connect') to Hamachi or to the FTP server (it's my first time using these technologies).
Also, is there a better, more reliable, or simpler way than Hamachi + FTP to send the data?
Thanks!
EDIT: Adding more info. I'm gathering data with a LabVIEW VI. The plan was to save this data into a file (txt, csv, or whatever), send the file over, and have another VI read the file and display some graphs, etc. There is a DataSocket feature in LabVIEW for sending data over the internet, but I'm not familiar with these internet protocols; it says I can use FTP, HTTP, and others. What is paid, and what can I do for free?
Might it be simpler to use e-mail (SMTP)? That has the advantage that the sender and receiver need not be up at the same time.
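For illustration, here is a rough sketch of that approach using Python's standard library (the addresses, SMTP host, credentials, and file name are placeholders, not anything from your setup):

```python
import smtplib
from email.message import EmailMessage

# Build a message with the logged data file attached.
msg = EmailMessage()
msg["From"] = "logger@example.com"
msg["To"] = "home@example.com"
msg["Subject"] = "Sensor data"
msg.set_content("Latest measurements attached.")

with open("data.csv", "rb") as f:
    msg.add_attachment(f.read(), maintype="text", subtype="csv",
                       filename="data.csv")

# If the 3G link is down, this raises an exception; the file stays on disk
# and the same code can simply be retried on the next cycle.
with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login("logger@example.com", "app-password")
    server.send_message(msg)
```

Receiving at home is then just an inbox (or a POP3/IMAP poll), so the two ends never have to be online at the same time.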
I want to make my own FTP server software. I know there are many FTP servers ready to install that have many features, but I still want to make my own, because I can customize it and make it the way I want. Also, I find it fun to code, and if I have the time I would rather build my own FTP server than download one that is ready to use. The problem is I can't find any information on how the protocol works. I would appreciate it if someone could explain how the protocol works, or at least point me to a page with useful information. Thanks!
If you are curious about being downvoted, here is the explanation:
When planning a homebrew FTP program, it is worth googling FTP first. You will soon find the RFCs (that is, "Requests for Comments") in which things like the FTP protocol are described; the core protocol is RFC 959.
That's a good starting point.
Then, when you have trouble with specific points, come back, show your code here, and ask for help.
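To give a feel for what the RFC describes, here is a toy sketch in Python - not a real FTP server, just the shape of the control channel the spec defines: the client sends text commands, the server answers with numeric reply codes.

```python
import socket

srv = socket.socket()
srv.bind(("0.0.0.0", 2121))   # 2121 so it doesn't need root; real FTP uses 21
srv.listen(1)
conn, _ = srv.accept()
conn.sendall(b"220 Toy FTP ready\r\n")           # greeting

while True:
    line = conn.recv(1024).decode().strip()       # toy: assumes one command per recv
    if not line:
        break
    cmd = line.split(" ", 1)[0].upper()
    if cmd == "USER":
        conn.sendall(b"331 Password required\r\n")
    elif cmd == "PASS":
        conn.sendall(b"230 Logged in\r\n")
    elif cmd == "QUIT":
        conn.sendall(b"221 Goodbye\r\n")
        break
    else:
        conn.sendall(b"502 Command not implemented\r\n")
conn.close()
```

The real protocol adds a second, separate data connection (set up via PORT or PASV) for directory listings and file transfers, and that is where most of the work in a homebrew server goes.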
Rather a hard question to nail down, but basically I'm wondering what the best way is (and not "what's your opinion" but "which will most adequately meet the requirements I shall set forth") to open a streaming connection from a client-side webpage to a server such that either can send data to the other without polling. I'm thinking the term for this is HTTP binding vs. HTTP polling. The context here is a chat application - I'd like a streamed connection so that the browser isn't constantly pushing requests out. The client end here is KnockoutJS and jQuery. I'd like the data pushed back and forth to be JSON (or at least manipulable by jQuery and Knockout's toJSON). The server end - I'm not quite sure what it's going to be yet, but I'll probably be running on a Linux server, so anything compatible with that works fine.
If there are any more details I can provide, just let me know - I'm sure I left some obvious detail out. Also, I'm aware there's probably a duplicate question on this, so if the right answer is to close this as a dupe and put in a link, that's great.
Thanks!
I think what you're looking for is referred to as Comet. The basic idea is to keep HTTP requests open for longer periods of time so that the server can send data to the client as it comes in, rather than the client having to continually poll the server for new data. There are multiple ways to implement it. This Wikipedia article is a good start for more info.
This MIX 2011 video discusses the long polling technique (although the suggestion in the video is that web sockets will be a better solution with future browsers).
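As a rough illustration of the long-polling idea (a minimal standard-library Python sketch, not how a production chat backend would be built): the server simply holds each request open until it has something to send.

```python
import json
import queue
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

messages = queue.Queue()

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/poll":
            self.send_error(404)
            return
        try:
            msg = messages.get(timeout=25)   # hold the request until data arrives
        except queue.Empty:
            msg = None                        # timed out; client just polls again
        body = json.dumps({"message": msg}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # A sender pushes a chat line; any waiting /poll request gets it immediately.
        length = int(self.headers.get("Content-Length", 0))
        messages.put(self.rfile.read(length).decode())
        self.send_response(204)
        self.end_headers()

ThreadingHTTPServer(("0.0.0.0", 8000), Handler).serve_forever()
```

On the browser side the client simply re-issues the GET as soon as the previous one returns, so there is effectively always one open request waiting for the next message.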
I'm trying to move a large number of files from one CDN to another. I know that they have a very high-speed connection to each other, so what I'd like to do is connect them directly, but the only protocol I have access to on each is FTP. Is there any way to log into the current CDN and have it send the files to the other FTP server? It seems like it should be possible; I just have no idea how to do it.
FXP: see the Wikipedia article. Powerful FTP clients are capable of doing this. From a protocol point of view it's trivial.
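If you end up scripting it instead of using a GUI client, a rough sketch of the FXP handshake with Python's ftplib might look like this (hostnames, logins, and file names are placeholders; many servers also disable third-party transfers for security, so this may simply be refused):

```python
from ftplib import FTP

src = FTP("ftp.source-cdn.example")   # server that currently holds the file
src.login("user", "password")
dst = FTP("ftp.dest-cdn.example")     # server that should receive it
dst.login("user", "password")

src.sendcmd("TYPE I")
dst.sendcmd("TYPE I")

# Put the source server in passive mode; the reply usually looks like
# "227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)" (format varies by server).
resp = src.sendcmd("PASV")
nums = resp[resp.index("(") + 1:resp.index(")")].split(",")

# Hand that data address to the destination server, then start the transfer;
# the file now flows directly between the two CDNs, not through you.
dst.sendcmd("PORT " + ",".join(nums))
dst.sendcmd("STOR somefile.bin")
src.sendcmd("RETR somefile.bin")

# Read the final "226 Transfer complete" replies on both control connections.
src.voidresp()
dst.voidresp()
```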
BTW, this question is probably off-topic here.
I'm maintaining a legacy system (pre .NET). Is there any way to fetch the time from an NTP server using the Windows API? Failing that, I could probably create a COM object with .NET to do it, but I would rather not go to that effort.
Yes, it's certainly possible. Unless you're quite ambitious, it's probably easiest to start from working source code. I'd go for Terje Mathisen's port, but that's probably because I've known Terje via Usenet (comp.lang.asm.x86) for years -- the other ports are probably perfectly good too.
The best source that I found was http://www.dataman.ro/?page_id=39.
However, I'm leaning toward using the time from a server on the local LAN/WAN via NetRemoteTOD. Windows is supposed to keep the date synchronized from an NTP server automatically, as long as it's configured properly on their local time server machine. Then, if their internet access goes down, I'll still be able to fetch the "standard" time.
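For reference, the NTP query itself is simple enough that a short sketch shows the whole round trip. This one is in Python purely to illustrate the packet layout (the server name is just an example), not the Windows API route you would use in the legacy system:

```python
import socket
import struct
import time

NTP_SERVER = "pool.ntp.org"     # example server
NTP_TO_UNIX = 2208988800        # seconds from 1900-01-01 to 1970-01-01

# 48-byte SNTP request: LI=0, version=3, mode=3 (client); everything else zero.
request = b"\x1b" + 47 * b"\x00"

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.settimeout(5)
    sock.sendto(request, (NTP_SERVER, 123))
    reply, _ = sock.recvfrom(512)

# The server's transmit timestamp (seconds since 1900) is bytes 40-43 of the reply.
seconds = struct.unpack("!I", reply[40:44])[0] - NTP_TO_UNIX
print(time.ctime(seconds))
```

NetRemoteTOD avoids all of that, of course, since the local time server has already done the synchronization.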
I'm interested in how you would approach implementing a BitTorrent-like social network. It might have a central server, but it must be able to run in a peer-to-peer manner, without communicating with it:
If a whole region's network is disconnected from the internet, it should still be able to pass updates from users inside the region to each other.
However, if some computer gets the posts from the central server, it should be able to pass them around.
There is some reasonable level of identification; some computers might be disseminating incomplete/incorrect posts or performing DoS attacks. It should be possible to mark some information as coming from more trusted computers and some from less trusted ones.
It should theoretically be able to use any computer as a server, while dynamically optimizing the network so that typically only fast computers with ample internet connectivity work as seeders.
The network should be able to scale to hundreds of millions of users; however, each particular person is interested in fewer than a thousand feeds.
It should include some Tor-like privacy features.
Purely theoretical question, though inspired by recent events :) I do hope somebody implements it.
Interesting question. Using the already existing Tor, P2P, and darknet features, plus some public/private key infrastructure, you could possibly come up with some great things. It would be nice to see something like this in action. However, I see a major problem: not some people using it for file sharing, but people flooding the network with useless information. I would therefore suggest using a Twitter-like approach where you can ban and subscribe to certain people, and starting with a very reduced set of functions at the beginning.
Incidentally, we programmers could make a good start toward that goal by NOT saving and analyzing too much information about users, and by using safe ways of storing and accessing user-related data!
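To make the trust part concrete, here is a tiny sketch of signed posts (it assumes the PyNaCl library; the names are made up for illustration, not taken from any existing implementation):

```python
from nacl.signing import SigningKey

# Each user keeps a private signing key; the public half is what peers share
# and attach trust levels to.
author_key = SigningKey.generate()

post = b"status update relayed peer-to-peer"
signed = author_key.sign(post)          # the signature travels with the post

# Any peer holding the author's verify (public) key can confirm the post was
# not altered by the untrusted nodes that relayed it; tampering raises
# nacl.exceptions.BadSignatureError.
author_key.verify_key.verify(signed)
```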
Interesting - the Rendezvous protocol (Apple's zeroconf, now called Bonjour) does something similar to this: it discovers "buddies" on the local network.
BitTorrent is a means of transferring static information; it's not intended to have everyone become a producer of new content. Also, BitTorrent requires that the producer be a dedicated server until all of the clients are able to grab the information.
Diaspora claims to be one such thing.