Ruby: Mailing List library or gem

Can anyone recommend a good gem or library for managing a mailing list with Ruby? No Rails solutions, if possible, please (I don't want to have ActionWhatever dependencies, this will most likely be done with Ramaze).
I just need basic features, like management of the list itself (CRUD operations on the user list), plus being able to send notifications, welcome messages, and also auto respond to basic things like subscribe and unsubscribe.
Optimally, people should be able to subscribe via both a Ramaze web page (i.e. I'd have Ramaze call/access the lib's API), as well as by sending an email to a specific email address. But I am willing to forego the operations by email.
I'm willing to entertain non-Ruby, or non-programmatic solutions, if they are good, but the ability to subscribe from a web page [under my control] is a must.
EDIT: Sorry, one important detail I forgot to add: This is intended to be a one-way mailing list. That is, people should be able to subscribe and unsubscribe alright, but only one person should be allowed to send to the list for broadcasting.

I'm not exactly sure about your requirements. If you only need basic features, why do you care what language it is implemented in? You would only need to know this if you need advanced features that you have to implement yourself!
Given your requirements, pretty much any mailing list server will fit the bill.
However, there is a specific suggestion I would like to make, just because I think it is an extremely cool example of a refreshing take on e-mail applications: Lamson.
Lamson is not a mailing list server; rather, it is an e-mail application development framework (similar to how Rails is a web application development framework). Lamson is written in Python rather than Ruby, but it is quite simply the best thing that has happened to e-mail since, well, ever. It was written by Zed Shaw (of Mongrel fame), and is based on the premise that just as Rails proved that web development doesn't have to be a PITA, e-mail development doesn't have to be either. (In that way it is similar to Adhearsion, which also took the ideas of Rails and applied them to a totally different domain, in this case telephony.)
There is already a mailing list service based on Lamson, called Libre List, which (naturally) hosts the Lamson mailing lists, among others. The source code to Libre List is included in the Lamson source distribution as an example.

I ended up going with Google Groups. (If silky would care to add an official answer to this effect, I would gladly mark it as the official accepted answer.)
Google Groups lets you alter settings so that you can have a "newsletter" like I wanted (i.e. single sender, multiple recipients). It also has an embeddable HTML snippet ready to go for quick subscription from a web page under your control.

Related

How to use Twilio to do some automation, and then let a human converse?

I'm a 100% newbie with Twilio but trying to help someone out.
We have a website where someone fills out a form and it kicks off some automated texts. First we want some automated back and forth (at the moment this is being done by our website built on bubble.io, but we could switch it to Twilio if need be).
At the end of the automated conversation, we want a human to then step in and have a human conversation.
Is this possible? How would I do that?
We're open to any platform.
Take a look at this part of the Twilio documentation; it may provide additional insight into what you are attempting. Studio is good for the initial human<>bot interaction, but at some point, for a two-way dialog, you will need to introduce Programmable Chat.
How to hand-off messaging conversations from Autopilot to your Contact Center
https://www.twilio.com/docs/autopilot/guides/how-to-hand-off
To achieve your goal, try the following five steps:
Hand-off documentation
send-to-agent-action
send-to-agent-function
Build Agent Hand-off
Complete the hand-off
To sum up, you need to go through several steps. Start with link 1, the hand-off documentation (you may need to come back to it for various related tasks before reaching your final output), and follow the steps in its side navigation as needed; the documentation is well organized. Then work through the remaining links in order, 2 through 5, to complete the hand-off.
That's all. Hopefully, it'll be helpful for you :)
This question is too open-ended to really answer, but sounds like a good use case for Twilio Studio, which is a GUI interface to the Twilio API and has widgets to gather text input, make http requests to send that input wherever you need, and connect the user to an agent afterwards.
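If the initial automated texts do move from bubble.io over to Twilio, a rough sketch of kicking off the first message from your own backend could look like the following. This assumes the twilio-ruby helper library; the credentials, phone numbers, and message body are placeholders, and the later bot-to-human hand-off would still happen in Studio/Autopilot as described in the links above.

```ruby
# Hypothetical sketch: send the first automated text when the web form is
# submitted, using the twilio-ruby helper library. Credentials and numbers
# below are placeholders.
require 'twilio-ruby'

client = Twilio::REST::Client.new(ENV['TWILIO_ACCOUNT_SID'], ENV['TWILIO_AUTH_TOKEN'])

client.messages.create(
  from: '+15005550006',  # your Twilio number (placeholder)
  to:   '+15005550001',  # the number collected from the form (placeholder)
  body: 'Thanks for filling out the form! Reply here and we will walk you through the next steps.'
)
```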

Website creation query

I need to create a website which stores the list of all games a player has played and shows it right on his profile. As the player completes a game, he adds it to his list.
So I would need a basic login setup, and then, using AJAX, I will populate the list of games he wants to add, so that he can track the games he has played.
So now I need some suggestions on how to go about it:
How to start building?
Which language do I need to pick up?
I am well versed with Java and j2ee.
Is this enough?
Also, I am a freelancer, so I can't afford to pay for a website. Is there any free website hosting service which will help me build the website I have in mind?
Also if I use any free website hosting service, will they provide me with a database and AJAX capabilities?
Here's the basic setup:
You need a domain first. Try to pick something unique, as it will be cheaper. You can find one on namecheap: https://www.namecheap.com
You need hosting. Again, go with namecheap.
To start building, you need to learn some HTML and CSS. HTML is markup of the web, and CSS is the stylesheet of the web. They aren't hard languages to start off in. You can start for free at Khan Academy: https://www.khanacademy.org/computing/computer-programming/html-css
I believe namecheap offers database support as well. AJAX isn't something a hosting service provides; it's a client-side technique in which JavaScript in the browser makes asynchronous HTTP requests, so it runs on top of the ordinary HTML, CSS and JavaScript that any host can serve.
This should get you going. I can't really give you more detailed information than this because your question is really broad. If you Google your questions, you'll get good answers and guides.
Best of luck.

What are some good Ruby-based web crawlers? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 8 years ago.
I am looking at writing my own, but I am wondering if there are any good web crawlers out there which are written in Ruby.
Short of a full-blown web crawler, any gems that might be helpful in building a web crawler would be useful. I know this part of the question is touched upon in a couple of places, but a list of gems applicable to building a web crawler would be a great resource as well.
I used to write spiders, page scrapers and site analyzers for my job, and still write them periodically to scratch some itch I get.
Ruby has some excellent gems to make it easy:
Nokogiri is my #1 choice for the HTML parser. I used to use Hpricot, but found some sites that made it explode in flames. I switched to Nokogiri afterwards and have been very happy with it. I regularly use it for parsing HTML, RDF/RSS/Atom and XML. Ox looks interesting too, so that might be another candidate, though I find searching the DOM a lot easier than trying to walk through a big hash, such as what is returned by Ox.
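As a quick illustration of how little code Nokogiri needs for the common case, here is a minimal sketch (the HTML string is made up):

```ruby
# Parse an HTML string with Nokogiri and pull out the link targets.
require 'nokogiri'

html = '<html><body><a href="/about">About</a> <a href="https://example.com">Example</a></body></html>'
doc  = Nokogiri::HTML(html)

doc.css('a').each do |anchor|
  puts anchor['href']   # => "/about", then "https://example.com"
end
```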
OpenURI is good as a simple HTTP client, but it can get in the way when you want to do more complex things or need to have multiple requests firing at once. I'd recommend looking at HTTPClient or Typhoeus with Hydra for modest to heavyweight jobs. Curb is good too, because it uses the cURL library, but its interface isn't as intuitive to me; it's still worth looking at, though I lean toward the previously mentioned ones.
Note: OpenURI has some flaws and vulnerabilities that can affect unsuspecting programmers so it's fallen out of favor somewhat. RestClient is a very worthy successor.
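For the Typhoeus-with-Hydra approach mentioned above, a rough sketch of firing several requests concurrently might look like this (the URLs and concurrency limit are placeholders):

```ruby
# Queue several requests on a Hydra and run them concurrently.
require 'typhoeus'

hydra = Typhoeus::Hydra.new(max_concurrency: 10)

%w[https://example.com https://example.org].each do |url|
  request = Typhoeus::Request.new(url, followlocation: true)
  request.on_complete do |response|
    puts "#{url} -> #{response.code} (#{response.body.bytesize} bytes)"
  end
  hydra.queue(request)
end

hydra.run   # blocks until all queued requests have completed
```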
You'll need a backing database, and some way to talk to it. This isn't a task for Rails per se, but you could use ActiveRecord, detached from Rails, to talk to the database. I've done that a couple times and it works all right. Instead, I really like Sequel for my ORM. It's very flexible in how it lets you talk to the database, from using straight SQL to using Sequel's ability to programmatically build a query, to modeling the database and using migrations. Once you have the database built, you could use Rails to act as a front-end to the data though.
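As a sketch of the Sequel approach, here is a minimal crawl-tracking table backed by SQLite; the schema and file name are only illustrative:

```ruby
# Track fetched pages in SQLite via Sequel.
require 'sequel'

DB = Sequel.sqlite('crawler.db')

DB.create_table? :pages do
  primary_key :id
  String   :url, unique: true
  Integer  :status
  DateTime :fetched_at
end

pages = DB[:pages]
pages.insert(url: 'https://example.com/', status: 200, fetched_at: Time.now)

# Which URLs have we already fetched successfully?
puts pages.where(status: 200).select_map(:url)
```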
If you are going to navigate sites in any way beyond simply grabbing pages and following links, you'll want to look at Mechanize. It makes it easy to fill out forms and submit pages. As an added bonus, you can grab the content of a page as a Nokogiri HTML document and parse away using Nokogiri's multitude of tricks.
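A short, hypothetical Mechanize sketch, filling in a search form and then using the page's built-in Nokogiri parsing (the site, form, and field names are made up):

```ruby
# Fetch a page, fill in a form, submit it, and inspect the result.
require 'mechanize'

agent = Mechanize.new
page  = agent.get('https://example.com/search')

form = page.forms.first
form['q'] = 'ruby web crawler'
results = agent.submit(form)

# A Mechanize page wraps a Nokogiri document, so the usual searches work:
results.search('a').each { |a| puts a['href'] }
puts results.links.size   # Mechanize's own list of links on the page
```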
For massaging/mangling URLs I really like Addressable::URI. It's more full-featured than the built-in URI module. One thing that URI does that's nice is it has the URI#extract method to scan a string for URLs. If that string happened to be the body of a web page it would be an alternate way of locating links, but its downside is you'll also get links to images, videos, ads, etc., and you'll have to filter those out, probably resulting in more work than if you use a parser and look for <a> tags exclusively. For that matter, Mechanize also has the links method which returns all the links in a page, but you'll still have to filter them to determine whether you want to follow or ignore them.
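Here is a small sketch contrasting the two: pulling URL-looking strings out of a blob of text with URI.extract, then cleaning them up with Addressable (the text is made up):

```ruby
# Extract URLs from free text, then normalize them with Addressable.
require 'uri'
require 'addressable/uri'

text = 'See https://Example.COM/a/../b?x=1 and http://example.org/img.png for details.'

raw = URI.extract(text, %w[http https])   # every URL-looking substring

raw.each do |u|
  puts Addressable::URI.parse(u).normalize.to_s   # lowercased host, dot-segments resolved
end
```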
If you think you'll need to deal with Javascript manipulated pages, or pages that get their content dynamically from AJAX, you should look into using one of the WATIR variants. There are flavors for the different browsers on different OSes, such as Firewatir, Safariwatir and Operawatir, so you'll have to figure out what works for you.
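A minimal sketch using the current watir gem (rather than the older Firewatir/Safariwatir/Operawatir flavors named above) to render a JavaScript-heavy page and hand the result to Nokogiri; it assumes you have a browser and its driver installed:

```ruby
# Let a real browser render the page, then parse the resulting DOM.
require 'watir'
require 'nokogiri'

browser = Watir::Browser.new :firefox    # needs the matching driver installed
browser.goto 'https://example.com/js-heavy-page'

doc = Nokogiri::HTML(browser.html)       # the DOM after JavaScript has run
puts doc.css('a').size

browser.close
```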
You do NOT want to rely on keeping your list of URLs to visit, or visited URLs, in memory. Design a database schema and store that information there. Spend some time up front designing the schema, thinking about what things you'll want to know as you collect links on a site. SQLite3, MySQL and Postgres are all excellent choices, depending on how big you think your database needs will be. One of my site analyzers was custom designed to help us recommend SEO changes for a Fortune 50 company. It ran for over three weeks covering about twenty different sites before we had enough data and stopped it. Imagine what would have happened if we had a power-outage and all that data went in the bit-bucket.
After all that you'll want to also make your code be aware of proper spidering etiquette: What are the key considerations when creating a web crawler?
I am building wombat, a Ruby DSL to crawl web pages and extract content. Check it out on github https://github.com/felipecsl/wombat
It is still at an early stage, but the basic functionality is already working. More will be added really soon.
So you want a good Ruby-based web crawler?
Try spider or anemone. Both have solid usage according to RubyGems download counts.
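A quick Anemone sketch, in case it helps with evaluating it (the start URL and options are placeholders):

```ruby
# Crawl a site two levels deep and print each page's status code and URL.
require 'anemone'

Anemone.crawl('https://example.com/', depth_limit: 2) do |anemone|
  anemone.on_every_page do |page|
    puts "#{page.code} #{page.url}"
  end
end
```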
The other answers, so far, are detailed and helpful, but they don't have a laser-like focus on the question, which asks for Ruby libraries for web crawlers. It would seem that this distinction can get muddled: see my answer to "Crawling vs. Web-Scraping?"
Tin Man's comprehensive list is good but partly outdated for me.
Most websites my customers deal with are heavily AJAX/Javascript dependent.
I've been using Watir / watir-webdriver / Selenium for a few years too, but the overhead of having to load up a hidden web browser on the backend just to render the DOM isn't viable. On top of that, after all this time they still haven't implemented a usable "browser session reuse" to let a new code execution reuse a browser already in memory, and tickets that might eventually have worked their way up the API layers keep getting shot down (referring to https://code.google.com/p/selenium/issues/detail?id=18 ). **
We're now migrating new projects over to the phantomjs gem (https://rubygems.org/gems/phantomjs), which lets the necessary data get rendered without any sort of invisible, Xvfb-backed, memory- and CPU-heavy web browser.
** Alternative approaches also failed to pan out:
how to serialize an object using TCPServer inside?
Can a watir browser object be re-used in a later Ruby process?
If you don't want to write your own, then use any ordinary web crawler. There are dozens out there.
If you do want to write your own, then write your own. A web crawler isn't exactly complicated; at its core it consists of the following (see the sketch after this list):
Downloading a website.
Locating URLs in that website, filtered however you dang well please.
For each URL in that website, repeat step 1.
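Here is the sketch referred to above: a toy breadth-first crawler built from those three steps with open-uri and Nokogiri, restricted to a single host. It deliberately ignores robots.txt, politeness delays, and most error handling, so treat it as an outline rather than something to point at a real site.

```ruby
# Toy crawler: download a page, collect its same-host links, repeat.
require 'open-uri'
require 'nokogiri'
require 'set'
require 'uri'

start   = URI('https://example.com/')     # placeholder start URL
queue   = [start]
visited = Set.new

until queue.empty? || visited.size >= 50  # hard cap so the toy terminates
  url = queue.shift
  next if visited.include?(url.to_s)
  visited << url.to_s

  begin
    html = url.open.read                  # step 1: download the page
  rescue StandardError
    next                                  # skip pages that fail to load
  end

  Nokogiri::HTML(html).css('a[href]').each do |a|  # step 2: locate URLs
    begin
      link = url.merge(a['href'])
    rescue StandardError
      next
    end
    queue << link if link.host == start.host       # step 3: repeat for each
  end
end

puts visited.to_a
```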
Oh, and this seems to be a duplicate of "Web crawler in ruby".

Ideas to extend this little project? - A pidgin web ui

I have built a little web UI for Pidgin (or, more generally, any libpurple-based messenger) using DBus and Sinatra.
It was for fun and learning purposes and now I'm looking for ideas to extend it.
Can you think of any useful applications or extensions for it?
Since I work on this project to learn something new, ideas for other technologies to be used/combined are welcome.
Finally here is the link: pidgin-web-ui
A few things that might be useful to many people would be:
Good, simple-to-configure HTTPS support, so that users in "monitored" countries are still able to chat freely (if the server is somewhere else).
A unified message archive. Many IM clients have various archive functions, but they are all different, limited, and hard to search, and many are "client only", so not accessible when one needs them the most. Since Pidgin can connect to so many IM networks, it would be cool to have such a "global message hub archive". This would ensure that everything the user talks about is archived (very useful for businesses too), easy to search, and available on a server (so always at hand).
A file archive on the server. The same as the unified message archive, but for the files/images users exchange. Having them on the server (with a hash for easy sync) as a backup and archive would also greatly reduce the traffic if they need to be shared more than once.
There would be many more nice features that would help many users, but the above three seem to be missing from the usual IM software.
My idea after a brainstorming minute:
Dropbot
Create a messaging account anywhere and add this account as a contact to your messenger. This contact is your Dropbot.
Change your UI so it does not display a conversation but a log. That way you can just drop things to the contact, like interesting links. There could be a Dropbot for a read-later queue, for your favorite quotations, or for a list of funny findings.
You could then extend your UI into a little mashup. It could follow the links and grab the title of the page and a content preview, just as Facebook does when you post a link to your wall.
You could further extend your app by adding post-drop behavior to the Dropbot.
Dropbot could post your link (probably with a message) on Twitter or Facebook.
Dropbot could automatically distribute the link to its other contacts (like your friends).
OK, that sounds fine... but you could do that without a message bot in between. What's the deal?
For me the advantage would be that my IM is always open and it would be fairly easy to drop a link. You could do the link dropping with Delicious or post stuff to a Google Wave, yeah. But I don't like having to go to a web page, log in and organize stuff in the UI. Actually, I stumble upon those links when I should be doing more important stuff instead. So just dropping them to my IM Dropbot contact would be cool.
Why not extend it to cover all the basic features of instant messaging (sending/receiving messages, adding contacts, etc...)? Seeing how many features you can reproduce may be a fun exercise. Create your own little Meebo...
Want to have fun?
Make a Markov-chain-based chatbot integrated into the web app. Make it use scraped web search results for the content, after searching for terms parsed out of the human's responses. That should give you funny, and sometimes eerily smart-looking, results. Have fun!
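If you want a starting point for the Markov-chain part, here is a toy sketch; the training text is hard-coded, whereas a real bot would feed in scraped search-result snippets as described above:

```ruby
# Minimal word-level Markov chain: train on text, then generate a reply.
class MarkovBabbler
  def initialize
    @chain = Hash.new { |h, k| h[k] = [] }
  end

  def train(text)
    text.split.each_cons(2) { |a, b| @chain[a] << b }
  end

  def babble(start_word, length = 15)
    word = start_word
    out  = [word]
    (length - 1).times do
      candidates = @chain[word]
      break if candidates.empty?
      word = candidates.sample
      out << word
    end
    out.join(' ')
  end
end

bot = MarkovBabbler.new
bot.train('the quick brown fox jumps over the lazy dog and the quick cat naps')
puts bot.babble('the')   # e.g. "the quick cat naps" (random)
```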
I have seen your code. Why not split dbus_thread into an EventMachine daemon for further scalability?
Integrate it with Twitter. Trace conversations (#Replies), including multi-party involvement. Log them. And so on.
Many interesting features and a popular, original API to learn.

GWT, with multiple clients

I am currently designing a web application using GWT, which is also my first time using GWT. I just have a general question about how (or whether) GWT handles communication between multiple clients.
My application requires users to log in and has personalized pages for different users; GWT is well able to do all of this. The only problem is that a user needs to know what other users are doing. A simple example is Google Talk: when one user is typing, the other side gets notified. So I am just wondering if GWT can do this?
As I said, this is my first time using GWT, so if GWT can provide these user-interaction features I will go with it; otherwise I can make changes while it is not too late.
Thanks!!!
Looking at the example you gave, if user A starts typing, there's no problem sending the "started typing" event to the server. The server would then have to look up who user A is talking with (say, user B) and get the information to B's browser. This is, of course, the trickier part, but there is more than one way to perform the task, as described for instance here.
In summary, if you're OK with passing the requests through the server, I don't see a problem with using GWT as the underlying technology.
What you need is server push/ajax push/comet/many other names. I've summarized the options you have for GWT in a different answer.
For a quick start, check out NGiNX_HTTP_Push_Module - IMHO it's the easiest one to customize to your needs and they provide a nice chat example that should get you started. However, if you also use jQuery or Mootools in your application (for example, for UI effects), you might want to also consider Ajax Push Engine/APE-Project (but remember that jQuery/Mootools might require some tweaking to work with GWT). Those two are my favorites :)
