Does Stanford CoreNLP support lemmatization for German?

I found German parser and POS tagger models that are compatible with Stanford CoreNLP. However, I was not able to get German lemmatization working. Is there a way to do so?

Sorry, as far as I know, no implementation of German lemmatization exists for Stanford CoreNLP.

Even the latest version 4.4.0 of CoreNLP still does not support lemmatization for German. See https://stanfordnlp.github.io/CoreNLP/human-languages.html for reference.

German has also been supported since version 3.6.
See http://stanfordnlp.github.io/CoreNLP/history.html.
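For reference, here is a minimal sketch of a German pipeline built from the properties file shipped with the German models (POS tags and parses, but no lemmas). This assumes the German models jar is on the classpath; the class name and sample sentence are illustrative only.

import edu.stanford.nlp.pipeline.CoreDocument;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import java.util.Properties;

public class GermanPipelineDemo {
    public static void main(String[] args) throws Exception {
        // Load the German defaults shipped inside the German models jar
        // (assumed to be on the classpath as StanfordCoreNLP-german.properties).
        Properties props = new Properties();
        props.load(GermanPipelineDemo.class.getClassLoader()
                .getResourceAsStream("StanfordCoreNLP-german.properties"));
        // Note that there is no "lemma" annotator here; as discussed above,
        // CoreNLP has no German lemmatizer to add.
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

        CoreDocument doc = new CoreDocument("Die Katzen sitzen auf der Matte.");
        pipeline.annotate(doc);
        doc.tokens().forEach(tok ->
                System.out.println(tok.word() + "\t" + tok.tag()));
    }
}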

Related

Spring Framework 5 Reference Documentation epub

For Spring versions 5.x I cannot find any EPUB or PDF version of the reference documentation. Former versions were available, e.g. at https://docs.spring.io/spring/docs/4.3.9.RELEASE/spring-framework-reference/epub/. Are they available anymore? EPUBs are perfect for reading on an e-book reader.
My guess is that it will get published in time.
In the meantime, these are the 5.0.4.RELEASE docs as PDF:
https://docs.spring.io/spring/docs/current/spring-framework-reference/pdf/
which you can convert to EPUB if you wish to do so.
The latest officially available EPUB is for 5.0.0.M5: https://docs.spring.io/autorepo/docs/spring-framework/5.0.0.M5/spring-framework-reference/epub/
An EPUB for the latest version is not available, but I would recommend converting the HTML documentation directly to EPUB, since converting PDF to EPUB does not give very good results (tools like Calibre convert the PDF to HTML and then to EPUB).
I tried an online converter on https://docs.spring.io/spring/docs/current/spring-framework-reference/ and it works pretty well; in particular, the code blocks look better than in the official EPUB.
Hope you find this helpful.

How to use Elasticsearch plugins with JHipster?

I am using JHipster with Elasticsearch.
It works quite well for English word search, but for Chinese and Japanese it splits words into single characters.
After some research, I understood that this is an Elasticsearch issue and can be solved with plugins such as elasticsearch-analysis-kuromoji and analysis-smartcn.
The official Elasticsearch site only tells me how to install the plugins with "bin/plugin install ..."; I don't think I can just run it like that.
It seems JHipster uses spring-data-elasticsearch for the implementation, but I could not find any guides on the plugins.
Does anyone know how to solve this?
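Not a full answer, but here is a sketch of what the mapping side could look like once such a plugin is installed on the Elasticsearch node. The entity and field names are purely illustrative, and the exact annotation attributes depend on your spring-data-elasticsearch version; the analyzer names come from the plugins themselves.

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;

// Assumes the plugins were installed on the node first, e.g.
//   bin/elasticsearch-plugin install analysis-kuromoji
//   bin/elasticsearch-plugin install analysis-smartcn
// and that the index is (re)created afterwards so the analyzers are known.
@Document(indexName = "blogentry")
public class BlogEntry {

    @Id
    private String id;

    // "kuromoji" is the Japanese analyzer contributed by analysis-kuromoji.
    @Field(type = FieldType.Text, analyzer = "kuromoji")
    private String contentJa;

    // "smartcn" is the Chinese analyzer contributed by analysis-smartcn.
    @Field(type = FieldType.Text, analyzer = "smartcn")
    private String contentZh;

    // getters and setters omitted for brevity
}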

Is SyntaxNet Compatible with OpenNLP?

I am new to OpenNLP and have used it to parse some sentences.
I saw in Google's TensorFlow YouTube videos that the Penn Treebank is old and somewhat outdated.
They have made another parsing model, named SyntaxNet, available as open source.
My question is: is it possible to use SyntaxNet models in the Apache OpenNLP libraries?
SyntaxNet is part of the TensorFlow repository on GitHub and is Python-based.
I am a Java developer.
Thanks in advance.
OpenNLP would pick up any tagging or syntactic/semantic model as long as you create the training data yourself. In this case, you would need to train the OpenNLP POS tagger with their set of tags, and also the OpenNLP chunker and/or parser, to implement something like SyntaxNet.
That said, my personal take on this is that SyntaxNet is a very opinionated piece of work and there is no reason to use it instead of the Penn Treebank. By doing so, you're locking yourself into a Google solution.
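For what it's worth, here is a minimal sketch of the retraining route described in the first paragraph, using OpenNLP's own training API. The file name, language code, tag format and sample sentence are placeholders; nothing here reads SyntaxNet models directly.

import java.io.File;
import java.nio.charset.StandardCharsets;

import opennlp.tools.postag.POSModel;
import opennlp.tools.postag.POSSample;
import opennlp.tools.postag.POSTaggerFactory;
import opennlp.tools.postag.POSTaggerME;
import opennlp.tools.postag.WordTagSampleStream;
import opennlp.tools.util.InputStreamFactory;
import opennlp.tools.util.MarkableFileInputStreamFactory;
import opennlp.tools.util.ObjectStream;
import opennlp.tools.util.PlainTextByLineStream;
import opennlp.tools.util.TrainingParameters;

public class TrainCustomPosTagger {
    public static void main(String[] args) throws Exception {
        // Training data in OpenNLP's word_tag format, one sentence per line,
        // annotated with whatever tag set you choose (placeholder file name).
        InputStreamFactory in = new MarkableFileInputStreamFactory(new File("custom-tags.train"));
        try (ObjectStream<String> lines = new PlainTextByLineStream(in, StandardCharsets.UTF_8);
             ObjectStream<POSSample> samples = new WordTagSampleStream(lines)) {
            POSModel model = POSTaggerME.train("en", samples,
                    TrainingParameters.defaultParams(), new POSTaggerFactory());
            // The trained model plugs straight into OpenNLP's tagger.
            POSTaggerME tagger = new POSTaggerME(model);
            String[] tags = tagger.tag(new String[] {"This", "is", "a", "test", "."});
            System.out.println(String.join(" ", tags));
        }
    }
}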

Natural Language Process using SharpNLP with a sample

I am really new to C# and want to do an NLP project using SharpNLP. I know it currently provides the following NLP tools:
sentence splitter
tokenizer
part-of-speech tagger
chunker
parser
name finder
coreference tool
interface to the WordNet lexical database
I tried several examples (I have the .nbin models) but failed to integrate the SharpNLP tools into VS 2015. Can anyone give some guidance or samples for using SharpNLP with Visual Studio?
Thanks
I have successfully created a sample project for newbies. You can get the project from the following link. PS: please change the .nbin file path to the file path on your own computer. Hope this will help.
Sample project: https://drive.google.com/file/d/0B3XcMZLArSF1UURzODRiVmE0RUE/view?usp=sharing

Full-text search engine other than Lucene

I need a full-text search engine that supports internationalization.
Thanks
Use Sphinx with MySQL
There is one called Xapian. I haven't used it but I've heard good things.
I've used Ferret (Ruby) and it worked for me; unfortunately, it only works with Ruby 1.8.x (it's not supported in Ruby 1.9).
Other solutions were already mentioned: Sphinx and Xapian; Solr (based on Java/Lucene) should also work.
An old yet good reference: http://wrg.upf.edu/WRG/dctos/Middleton-Baeza.pdf
