Elasticsearch downloadable manual

Does anyone know where I can find a downloadable manual (any format) for Elasticsearch? http://www.elasticsearch.org/guide/ is OK, but sometimes the site is not reachable, and searching for content there is not practical.

Not that I am aware of. You can fork and run the site locally if you have problems with it being unreachable, which, by the way, never happens for me. It is available on GitHub: https://github.com/elastic/elasticsearch/tree/master/docs

Cloning the repo is useless: you cannot build their docs, since the AsciiDoc files for the build --all command are access-protected.
The only solution is to crawl their site, for example with wget:
wget -r -l1 --page-requisites -N --convert-links -e robots=off "https://www.elastic.co/guide/en/elasticsearch/reference/6.5/index.html"
The index.html contains links to all the doc subdirectories, so you can specify -l1, which makes wget crawl only one level deep.
Note that you cannot use the search function on the crawled docs, since there is no server running in the background.
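As a crude substitute for search, you can grep the mirrored HTML locally. A minimal sketch (the path matches where the wget call above saves the mirror; the search term is just an illustration):
grep -ril "match_phrase" www.elastic.co/guide/en/elasticsearch/reference/6.5/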

Semi-related and perhaps helpful: the ES docs are available as a DocSet for the Dash macOS and iOS apps for offline documentation. May or may not be useful, but I find it great for travel. It costs money but can be used for free as well.

Here is the GitHub repo for all the documentation for their products:
https://github.com/elastic/docs
Please note the following disclaimer in the repo:
Conditions of use
This documentation build process is provided to the public purely for the
purpose of testing documentation changes before submitting pull requests to
the appropriate Elasticsearch repository.
The documents produced by this build process may be published only on
http://www.elastic.co. They may not be published in any other form or on
any other website without explicit permission.

Related

Kibana plugin resources

Hey, I'm looking for up-to-date Kibana plugin resources to help me learn and understand how to develop one.
All the resources I've found are out of date.
Can anyone help, please?
I suggest studying the plugin directory structure of the traffic plugin, which is one of the simplest plugins to understand; you can add it directly to your installed plugins folder in Kibana and see it working.
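For orientation, a rough sketch of how such a plugin's directory tree tends to look (the file names here are illustrative assumptions, not copied from the traffic plugin):
my_plugin/
    package.json    # plugin name and the Kibana version it targets
    index.js        # registers the plugin with Kibana
    public/
        app.js      # browser-side entry point of the plugin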
Other than that, I would also suggest reading Tim Roes's blog posts on developing Kibana plugins.
Lastly, I would suggest using the Elasticsearch discussion forum for Kibana-related issues as well, for quicker responses.
Kibana consists of different JS files that are loaded in a synchronized way; you can use them as required.

How to host privately owned documentation with ReadTheDocs or Sphinx

I am totally new to this. I spent a whole day trying to figure out the "most commonly used" approach. What I want to implement is something like readthedocs.org, but for a private customer (and a proprietary project).
Almost all of the FAQs, blog posts, howtos, etc., describe how to host (publish) documentation either with GitHub Pages or with readthedocs.org (.com).
I've tried using Sphinx (NB: NOT "Sphinx Search") locally, and I could quite easily build a sample set of demo docs, but I don't exactly understand how to host a "searchable" solution like the one on http://www.sphinx-doc.org (it seems to use readthedocs.org as a search backend, though).
I've tried to deploy readthedocs.org locally, but:
1. The "search" doesn't work (nobody listens on 127.0.0.1:9200).
2. I was unable to build any documentation ("Version not found" or "Project not found").
3. I was unable to add a project from my private repository (ssh:).
(NB: I was trying this on Windows, and that might explain items 1-2, but not 3, I believe.)
So far it feels like I've run out of ideas.
Any advice will be highly appreciated!
The only thing you need to host Sphinx documentation is a static file server (the search works without a backend; see my answer here).
That said, using a private readthedocs server is probably over-engineering.
Just deploy the files to a static file server and point the base URL (e.g. docs.myapp.com) to the index.html file.
You can automate the deployment with git hooks.
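For instance, a minimal post-receive hook sketch for a bare repo on the server (the branch name, docs path, and target directory are assumptions about your setup):
#!/bin/sh
# Check out the pushed revision, rebuild the Sphinx docs, and publish them.
TMP=$(mktemp -d)
git --work-tree="$TMP" checkout -f master
sphinx-build -b html "$TMP/docs" "$TMP/_build"
rsync -a --delete "$TMP/_build/" /var/www/docs/
rm -rf "$TMP"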
For the sake of completeness: I am sure it is possible to get a local readthedocs server to build your project. But readthedocs is explicitly not designed for on-premises deployments, and you might find it hard to get professional support. I was involved in a scenario where the DevOps team decided it was way easier to automate the deployment using their usual set of tools, after we struggled with build/performance issues on our local readthedocs instance.
If you want to host static documentation, you can do this by setting up a static file server like nginx.
Just put this in /etc/nginx/sites-available/default:
server {
    listen 80 default_server;
    index index.html index.htm index.nginx-debian.html;
    server_name _;

    location /doc/your-docs {
        # With "root", the request URI is appended to the path, so the files
        # must live under /path/to/docs/doc/your-docs/ (use "alias" instead
        # to map the location to a different directory).
        root /path/to/docs;
    }
}
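After editing the config, something like the following validates and applies it (assuming a systemd-based distribution):
sudo nginx -t
sudo systemctl reload nginx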
We built a simple tool around this concept to self-host documentation for multiple projects and version them:
https://github.com/docat-org/docat

Trigger TeamCity build when network folder changes

Can't seem to find this question asked anywhere... but I would like to trigger a TeamCity build when a network folder is updated. This is content for our installer, too big to put into GitHub, and hence managed by a team internally.
Seems like the sort of thing someone would have written a plugin for, but I can't find one. Does anyone have a solution for this? Ideally I'd just point the trigger at a network folder and TeamCity would start a build whenever that folder gets updated.
Not sure monitoring a network folder is a good, scalable solution; there are a couple of alternative approaches which might help in your case:
Since you're already using TeamCity, and maybe even building your installer in TeamCity, you might make use of Snapshot or Artifact dependencies, or use a Finish Build trigger.
You could trigger a build in TeamCity via the REST API, from the tool/script that uploads your installer to the remote folder, by executing a POST request like the following; here's the corresponding REST API documentation:
curl http://teamcity-host/app/rest/buildQueue --request POST --user user:password -H "Content-Type:application/xml" -d "<build><buildType id='buildToTriggerId'/></build>"
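For illustration, a crude polling sketch that ties the two ideas together on a Linux box (the folder path, host, credentials, and build ID are all placeholders; polling is used deliberately, since inotify generally does not see remote changes on network mounts):
# Poll the network folder and queue a TeamCity build when its listing changes.
PREV=""
while true; do
    CUR=$(ls -lR /mnt/installer-content | md5sum)
    if [ -n "$PREV" ] && [ "$CUR" != "$PREV" ]; then
        curl http://teamcity-host/app/rest/buildQueue --request POST \
            --user user:password -H "Content-Type:application/xml" \
            -d "<build><buildType id='buildToTriggerId'/></build>"
    fi
    PREV="$CUR"
    sleep 60
done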
Update
Actually, there is a TeamCity plugin to monitor the content (changes) returned by a specified URL, file, or directory, too: Url Build Trigger

Google PageSpeed not updating cache

I have Google's PageSpeed installed on an nginx server, set up following here. I need to flush/delete the previously cached content but could not find a solution. The PageSpeed site says to use this command:
touch /var/ngx_pagespeed_cache/cache.flush
But I have had no success with it. Thanks for any help.
Try this (see here for more):
sudo touch /var/cache/mod_pagespeed/cache.flush
Is /var/ngx_pagespeed_cache your caching folder? If so, this should work. As Dayo noted, we do not delete the files, just invalidate them.
However, you can also just rm -r the caching folder and then reload Nginx (to clear the in-memory cache). If you are using memcached, you'd have to clear that too.
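For example, assuming /var/ngx_pagespeed_cache is the cache folder (as in the question) and a systemd-based setup:
sudo rm -rf /var/ngx_pagespeed_cache/*
sudo systemctl reload nginx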

How to pull from a fellow developer's repository using Mercurial

I'm trying to setup Mercurial on developer workstations so that they can pull from each other.
I don't want to push.
I know each workstation needs to run
hg serve
The format of the pull command is
hg pull ssh:[SOURCE]
What I'm having a problem with is defining SOURCE, and any other permission issues.
I would believe that SOURCE ends with the name of the repository being pulled from. What I don't know is how to form the host name. Can I use IPs instead?
What permission issues do I need to look out for?
SOURCE == //<hostname>/<repository>
All developers or test stations are running Windows 7 or Windows XP.
I have searched for this answer and have come up empty. I did look at all the questions suggested by SO as I typed this question.
This is probably a simple Windows concept, but I'm not an expert in simple Windows concepts. :)
The hg help urls output has these examples:
Valid URLs are of the form:
local/filesystem/path[#revision]
file://local/filesystem/path[#revision]
http://[user[:pass]@]host[:port]/[path][#revision]
https://[user[:pass]@]host[:port]/[path][#revision]
ssh://[user@]host[:port]/[path][#revision]
and a lot of info about what can be used for each component (host can be anything that your DNS resolver resolves, or an IPv4 or IPv6 address). I believe that on Windows systems, UNC paths count as well.
Also, you appear to have some confusion about when you can use ssh. You can use ssh:// URLs to access repositories on the file systems of machines that are running SSH servers. If they're running hg serve, then you can access them using the http:// URL that hg serve prints when you start it. hg serve is usually used for quick "here, grab this from me and see if you can tell me what I'm doing wrong" situations rather than for all-the-time sharing.
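To make that concrete, a small sketch (the repository path, port, and IP address are placeholders):
On the workstation that has the changes:
cd C:\work\myrepo
hg serve --port 8000
On the workstation that wants to pull:
hg pull http://192.168.1.42:8000/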
