Is it possible to replace spaCy with udpipe within Rasa-NLU? - rasa-nlu

I have been testing Rasa-NLU, which internally uses spaCy, for several days, and I was quite disappointed with its Portuguese language support. While trying to figure out how to improve the training data, I found an excellent script comparing spaCy with udpipe, which can be checked at this link.
I would like to know: can I continue using Rasa-NLU but replace the spaCy engine with udpipe?

Since Rasa is open source, it is easily customizable. In your case you could build your own NLU component, as described in this blog post. A rough sketch of what such a component might look like is shown below.
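This is only a minimal sketch, assuming the pre-1.0 rasa_nlu custom-component interface described in that blog post; the class name UdpipeTokenizer is hypothetical, and the actual call into udpipe (e.g. via the ufal.udpipe bindings) is left as a placeholder.

    # A minimal sketch of a custom Rasa NLU tokenizer backed by udpipe.
    # Assumes the pre-1.0 rasa_nlu Component interface; the body of
    # _tokenize() is a stand-in for whatever udpipe binding you use.
    from rasa_nlu.components import Component
    from rasa_nlu.tokenizers import Token


    class UdpipeTokenizer(Component):
        """Tokenizes incoming messages with udpipe instead of spaCy."""

        name = "tokenizer_udpipe"   # how the component is referenced in the pipeline
        provides = ["tokens"]
        language_list = ["pt"]      # Portuguese, per the question

        def __init__(self, component_config=None):
            super(UdpipeTokenizer, self).__init__(component_config)
            # Load your udpipe model here, e.g. from a path in component_config.

        def train(self, training_data, cfg, **kwargs):
            for example in training_data.training_examples:
                example.set("tokens", self._tokenize(example.text))

        def process(self, message, **kwargs):
            message.set("tokens", self._tokenize(message.text))

        def _tokenize(self, text):
            # Naive whitespace split as a placeholder; replace this with a
            # real call into udpipe to get proper tokenization for Portuguese.
            tokens, offset = [], 0
            for word in text.split():
                offset = text.index(word, offset)
                tokens.append(Token(word, offset))
                offset += len(word)
            return tokens

You would then reference this component in your NLU pipeline configuration (custom components are referenced by their module path) in place of the spaCy tokenizer.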

Related

Is there a list of the supported ML algorithms in TensorFlow?

I am writing my bachelor thesis on "Machine Learning in Java" and comparing frameworks and libraries. Currently I am collecting information about the different machine learning algorithms supported by each framework. I write them down in Excel and use this data to evaluate the individual frameworks. Now to my problem:
I can't find a list of supported ML algorithms in TensorFlow (Java API). Is there a quick overview that I can use for this? I have already searched extensively, with no success so far. Could someone please help me? That would be great.
The list of TensorFlow optimizers can be found here (a quick Python illustration follows the link):
https://www.tensorflow.org/api_docs/python/tf/keras/optimizers
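For a quick Python illustration of what that page documents (the question asks about the Java API, so treat this only as a pointer to the same optimizer classes):

    # List the optimizer classes exposed under tf.keras.optimizers and use one.
    import tensorflow as tf

    optimizer_names = [name for name in dir(tf.keras.optimizers)
                       if not name.startswith("_")]
    print(optimizer_names)  # e.g. ['Adadelta', 'Adagrad', 'Adam', 'SGD', ...]

    # Plugging one of them into a model is a one-liner:
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
                  loss="mse")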

Can I generate images in Clojure and show them inline in LightTable REPL?

I want to write some Clojure code that manipulates images, and I wonder whether there is a way to show them inline in the LightTable REPL.
As stated by @juan.facorro, it should be as easy as encoding the image as base64. This is definitely possible, and it is already in use in the Python plugin, which uses it together with matplotlib and IPython to create inline plots, which is really cool.
Sadly, this is not (yet) implemented in LightTable's Clojure plugin. I have been discussing implementing it with the LightTable team; you can check the discussion here. For the moment I don't have much time, so if you want to contribute (either to the discussion or by programming it yourself), you are most welcome. A rough sketch of the base64 idea is shown below.
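To make the base64 idea concrete, here is a rough sketch in Python (mirroring what the Python plugin does with matplotlib), not Clojure: render a plot to PNG bytes and wrap them in a data URI that an HTML-based editor like LightTable could display inline.

    import base64
    import io

    import matplotlib
    matplotlib.use("Agg")  # render off-screen, no window needed
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    ax.plot([0, 1, 2], [0, 1, 4])

    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    encoded = base64.b64encode(buf.getvalue()).decode("ascii")

    # An editor plugin would inject this string into an <img> tag.
    data_uri = "data:image/png;base64," + encoded
    print(data_uri[:60] + "...")

A Clojure plugin would do the equivalent: base64-encode the image bytes and hand the resulting data URI to LightTable's HTML view.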

How to start learning hadoop [closed]

I am a web developer with experience in web technologies such as JavaScript, jQuery, PHP, and HTML, and I know the basic concepts of C. Recently I became interested in learning more about MapReduce and Hadoop, so I enrolled in a parallel data processing with MapReduce course at my university. Since I don't have any prior programming experience in object-oriented languages like Java or C++, how should I go about learning MapReduce and Hadoop? I have started reading the Yahoo Hadoop tutorials and O'Reilly's Hadoop: The Definitive Guide, 2nd edition.
I would like you to suggest ways I could go about learning MapReduce and Hadoop.
Here are some nice YouTube videos on MapReduce
http://www.youtube.com/watch?v=yjPBkvYh-ss
http://www.youtube.com/watch?v=-vD6PUdf3Js
http://www.youtube.com/watch?v=5Eib_H_zCEY
http://www.youtube.com/watch?v=1ZDybXl212Q
http://www.youtube.com/watch?v=BT-piFBP4fE
Also, here are some nice tutorials on how to set up Hadoop on Ubuntu:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-multi-node-cluster/
You can access Hadoop from many different languages, and a number of services will set up Hadoop for you. You could try Amazon's Elastic MapReduce (EMR), for instance, without having to go through the hassle of configuring servers, workers, etc. This is a good way to get your head around MapReduce processing while deferring the issues of learning how to use HDFS well, how to manage your scheduler, and so on.
It's not hard to search for your favorite language and find Hadoop APIs for it, or at least some tutorials on linking it with Hadoop. For instance, here's a walkthrough of a PHP app run on Hadoop: http://www.lunchpauze.com/2007/10/writing-hadoop-mapreduce-program-in-php.html. A minimal Hadoop Streaming example in Python is sketched below.
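The PHP walkthrough above relies on Hadoop Streaming, which runs any executable that reads lines on stdin and writes tab-separated key/value pairs on stdout. As a rough sketch (file names and the streaming-jar path are illustrative), a word-count mapper in Python looks like this:

    # mapper.py: emit "<word>\t1" for every word; Hadoop sorts by key
    # before handing the pairs to the reducer, which sums the counts.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word, 1))

A matching reducer sums the counts per word, and both scripts are wired up with something roughly like: hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py -input in/ -output out/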
Answer 1:
It is very desirable to know Java. Hadoop is written in Java, and its popular SequenceFile format depends on Java.
Even if you use Hive or Pig, you'll probably need to write your own UDF someday. Some people write them in other languages, but Java arguably has the most robust, first-class support for them.
Many Hadoop tools are not fully mature (Sqoop, HCatalog, and so on), so you'll see plenty of Java stack traces, and you'll probably want to hack on the source code someday.
Answer 2:
It is not required for you to know Java.
As others have said, it can be very helpful depending on how complex your processing is. However, there is an incredible amount you can do with just Pig and, say, Hive.
I would agree that it is fairly likely you will eventually need to write a user-defined function (UDF); however, I've written those in Python, and it is very easy to write UDFs in Python (a small example follows the source link below).
Granted, if you have very stringent performance requirements, then a Java-based MapReduce program would be the way to go. However, performance improvements are being made all the time in both Pig and Hive.
So, the short answer to your question is "no": you are not required to know Java in order to do Hadoop development.
Source:
http://www.linkedin.com/groups/Is-it-must-Hadoop-Developer-988957.S.141072851
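To show what Answer 2 means by Python UDFs, here is a minimal sketch of a Pig UDF written in Python (executed via Jython); the function and file names are illustrative, and the outputSchema decorator is the one Pig ships as pig_util.

    # my_udfs.py: count whitespace-separated tokens in a chararray field.
    from pig_util import outputSchema

    @outputSchema("word_count:int")
    def count_words(text):
        if text is None:
            return 0
        return len(text.split())

In a Pig script you would register it with something roughly like REGISTER 'my_udfs.py' USING jython AS my_udfs; and then call my_udfs.count_words(line) inside a FOREACH ... GENERATE.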
1) Learn Java. No way around that, sorry.
2) Profit! It'll be very easy after that -- Hadoop is pretty darn simple.
It sounds like you are on the right track. I recommend setting up some virtual machines on your home computer so you can start taking what you see in the books and implementing it in your VMs. As with many things, the only way to get better at something is to practice it. Once you get into it, I am sure you will have enough knowledge to start a small project built on Hadoop. Here are some examples of things people have built with Hadoop: Powered by Hadoop
Go through the Yahoo Hadoop tutorial before going through Hadoop: The Definitive Guide. The Yahoo tutorial gives you a very clean and easy understanding of the architecture.
I think the concepts are not arranged well in the book, which makes it a little difficult to study.
So do not study them together; go through the web tutorial first.
I just put together a paper on this topic. There are great resources above, but I think you'll find some additional pointers here: http://images.globalknowledge.com/wwwimages/whitepaperpdf/WP_CL_Learning_Hadoop.pdf
Feel free to visit my blog about Big Data: https://oyermolenko.blog. I've been working with Hadoop for a couple of years, and in this blog I want to share my experience from the very start. I came from a .NET environment and faced a few challenges in switching from one ecosystem to another. The blog is aimed at people who haven't worked with Hadoop but have some basic technical background, like you. Step by step, I want to cover the whole family of Big Data services and describe the concepts and common problems I encountered while working with them. I hope you will enjoy it.

Scala, Swing and MVC

I found some information about using Swing with Scala in the Programming in Scala book. I also found some basic information here: http://www.scala-lang.org/sid/8. But there is no information about how to build a bigger application based on Swing. Then I found some material in German (though the code is in Scala ;)): http://www.scalatutorial.de/topic123.html. It is good... but again, only for small applications. I'm going to write something bigger, so I want to have not one but several models, and several views for each, like in Ruby on Rails. Do you know any good tutorials or examples that would help me do this properly?
Here's a program using scala-swing: https://github.com/lrytz/pacman, though it is also rather small.

What is the most functional and ready-to-use SWT API in Scala?

Is there a Scala SWT wrapper/API that has the most features and is the most ready to use? I see a couple of what appear to be informal wrappers, but I can't tell whether they're maintained. Also, I see one or two in multiple places and I'm not sure where their canonical home is.
I've been using Dave Orme's XScalaWT with my own additions for JFace viewers for a while, and I find it great. Be sure to read the very interesting introductory blog post.
I'm also aware of SSWT but have no experience using it. I believe XScalaWT is more mature.
