Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 3 years ago.
I have searched enough on the web but did not find a solution.
Is there a remote GUI client for an Elasticsearch server, similar to Oracle SQL Developer, that lets me see the schema and other details of a remote Elasticsearch database?
Currently I am using the elasticsearch-head plugin.
It doesn't let me connect to a remote Elasticsearch cluster; it only works when the Elasticsearch server is hosted on the same machine. I also added the entries below to the elasticsearch.yml file, but it doesn't work. It says there is no connection to the remote host.
#http.cors.enable: true
#http.cors.allow-origin: "remotehosturl:9200"
You need to remove the # character in front of your two lines, as it comments them out and thus they have no effect.
Also, the correct CORS setting is named http.cors.enabled, not http.cors.enable.
So you should include these two lines:
http.cors.enabled: true
http.cors.allow-origin: "remotehosturl:9200"
You also have the choice of other plugins, such as Marvel, Kopf, or the Sense Chrome plugin (soon to be available as a Kibana-powered standalone tool).
Dejavu is an MIT-licensed modern alternative to elasticsearch-head; I am one of the contributors to the project.
You can use it as a remote web app, a Chrome extension, or a Docker image.
It supports:
An Excel-like UI for CRUD operations, including the ability to view and add mappings from the GUI,
Visual filters,
Importing CSV / JSON files directly,
Query views,
Exporting data in CSV / JSON formats.
When using it in remote mode, you will have to set the Elasticsearch config to allow CORS from the origin where dejavu's app is running.
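As a sketch, the relevant elasticsearch.yml settings might look like the following; the origin URL is an assumption, so substitute wherever your dejavu instance is actually served from:

```yaml
http.cors.enabled: true
http.cors.allow-origin: "http://localhost:1358"   # assumed dejavu origin; adjust to yours
http.cors.allow-headers: X-Requested-With, Content-Type, Content-Length, Authorization
http.cors.allow-credentials: true
```

Restart the Elasticsearch node after changing these settings for them to take effect.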
You can read more about the project at https://github.com/appbaseio/dejavu.
Marvel's Sense is the official GUI client for Elasticsearch, maintained by Elastic. As of the ES 2.0 release it is even free to use in production. The Sense query interface has autocompletion hooked into it, which is very useful when writing complex queries, and Marvel offers a lot of other metrics about cluster health, CPU load, and memory (built on top of Kibana). I prefer Sense over head; it is worth taking a look at least.
You have to install this plugin on your remote server.
Installation- https://www.elastic.co/downloads/marvel
Is it possible to integrate Checkmarx Static Application Security Testing (SAST) tool into Gitlab Continuous Integration (CI) Pipeline for static security scanning?
I have been using Checkmarx with TeamCity and Jenkins pipelines via their plugin. However, for a GitLab pipeline, we need to use the REST APIs or the CLI. I would prefer the CLI over the REST APIs, as the CLI provides more functionality that can be used for pipeline decisions.
You can check their wiki:
https://checkmarx.atlassian.net/wiki/spaces/KC/pages/5767170/CxSAST+API+Guide
https://checkmarx.atlassian.net/wiki/spaces/KC/pages/52560015/CxConsole+CxSAST+CLI
You can always raise a support ticket for getting the recommended approach by Checkmarx.
For now, no, Checkmarx doesn't have a dedicated plugin for GitLab integration, but they have a really good article on how to enable and configure the integration:
https://checkmarx.atlassian.net/wiki/spaces/SD/pages/1929937052/GitLab+Integration
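A minimal sketch of a .gitlab-ci.yml job invoking the CxConsole CLI, using the same flags shown elsewhere in this thread; the install path, project name, and the CX_SERVER_URL / CX_TOKEN CI variables are assumptions (store the token as a masked CI variable rather than hard-coding it):

```yaml
checkmarx_scan:
  stage: test
  script:
    # assumed install location of the CxConsole CLI on the runner
    - export CHECKMARX_HOME=/opt/CxConsolePlugin
    # authenticate with a token, not a hard-coded username/password
    - >-
      $CHECKMARX_HOME/runCxConsole.sh Scan -v
      -CxServer "$CX_SERVER_URL"
      -cxToken "$CX_TOKEN"
      -ProjectName "MyProject"
      -locationtype folder
      -locationpath "$CI_PROJECT_DIR"
```

You can then fail the pipeline based on the CLI's exit code, which is what makes the CLI preferable to the REST APIs for pipeline decisions.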
I chose a freestyle job rather than a pipeline job in Jenkins.
Here is how I configured it, even without the Checkmarx plugin.
First generate a token using the command below:
runCxConsole.cmd GenerateToken -v -CxUser username -CxPassword admin -CxServer http://localhost
Then configure the lines of code below in Build --> Execute Shell.
Jenkins Script
#!/bin/bash
# JAVA_HOME should point to the JDK/JRE install directory, not the java binary itself
export JAVA_HOME=<path to JDK home>
export CHECKMARX_HOME=/<checkmarx plugin path>/CxConsolePlugin-8.90.2
echo ${WORKSPACE}
echo $CX_PROJECT_NAME
mkdir -p ${WORKSPACE}/cxReports
export CHECKMARX_REPORTS_HOME=${WORKSPACE}/cxReports
echo $CHECKMARX_REPORTS_HOME
$CHECKMARX_HOME/runCxConsole.sh Scan -v -CxServer <checkmarx server details> -ProjectName "<project name>" -cxToken <token> -locationtype folder -locationpath "${WORKSPACE}" -preset "Checkmarx Default" -reportcsv $CHECKMARX_REPORTS_HOME/$CX_PROJECT_NAME.csv -ReportPDF $CHECKMARX_REPORTS_HOME/$CX_PROJECT_NAME.pdf
Note: Always use a token for authentication with the server instead of hard-coding the username and password in the CLI command.
For more information you can visit https://checkmarx.atlassian.net/wiki/spaces/SD/pages/222232891/Authentication+Login+to+the+CLI
https://checkmarx.atlassian.net/wiki/spaces/SD/pages/1929937052/GitLab+Integration
I am attempting to migrate my server from Parse.com over to Heroku with their one-click migration. Their documentation says that Parse Server supports the "file" type, but I can't find any documentation on transferring these files so that Heroku can access them.
This isn't an answer but I've been having the same issue/dilemma and have partial information that might be helpful in eventually finding an answer. I did a migration and took a look at some of the stuff going on.
For example: photoObj.get('file').url();
On Parse Hosting: files point to the following:
http://files.parsetfss.com/parseFileKey/fileName.ext
This is stored in an Amazon S3 bucket. Basically this points to:
https://s3.amazonaws.com/files.parsetfss.com/parseFileKey/fileName.ext
After migrating to Heroku/MongoLab, photoObj.get('file').url() points to the following:
http://files.parsetfss.com/newHostFileKey/fileName.ext
newHostFileKey is something we designate in the parse-server setup, and the URL seems to be generated automatically from this setting.
I don't see any evidence so far that the migration tool moves files from Parse Hosting to the new host/db.
File uploading to the new host works fine. On the new host, if one generates a new file it ends up pointing to something like this:
http://newHostURL/parse/files/appID/fileName.ext
parse is whatever you designate at the startup of your parse-server, like app.use('/parse', api);
appID is whatever you designate at the startup of your parse-server, like:
var api = new ParseServer({
  databaseURI: 'mongodb://...',  // your MongoDB connection string (required)
  appId: 'appID',
  masterKey: '...',              // also required by parse-server
  fileKey: 'newHostFileKey'
});
Changing the URL of a Parse-hosted file to fit the new host pattern doesn't yield anything (file not found, etc.).
I have no idea how new files are being stored or where the URL routes to.
With new files that are uploaded via the new host, I notice that some new collections are created in the MongoLab DB: fs.chunks and fs.files (these are MongoDB's GridFS collections).
fs.chunks is where the data of the file is being stored (I think). So under the new Heroku/MongoLab setup, files seem to reside "in" the DB.
As for the best way to migrate images from Parse hosting to the new hosting, I have no idea, and I'm not sure there is a straightforward answer publicly available at this point.
Just installed and started using VSSonarExtension, but already have a couple of questions that I am unable to answer myself:
1. Why does the extension need to connect to the server if it scans local code with local tools, such as FxCop and StyleCop?
2. After looking through roughly 15 files, I get an error saying that I need a license to scan more than 15 files in one session - isn't the extension open source?
3. I want to make sure that the rules specified on the server match the rules locally, so that I get the same output with regards to technical debt - how can I achieve it?
4. Seems like the extension does not use FxCop, according to the logs; moreover, sometimes logs indicate that some code has been skipped because some rules were not present or defined in StyleCop. What does that mean?
Looking forward to your replies, thanks for help.
Answers:
1. So that it can sync the quality profile with the server; you don't want your users to complain that the local analysis is different from what they have on the server.
2. Analysis of local files is provided by an external plugin; please contact their support for details.
3. It's the default behaviour, so the answer to question 1 covers this.
4. Please open a ticket on GitHub with those issues; it may be that there are some differences between the server profile and what is installed locally.
I am completely new to web development and I want to set up a database search client on a website I am making because I do not want to write my own inefficient MySQL query strings. My plan is to use Elastic Search for this and my main question is:
Once my site is on a dedicated server somewhere, how do I install Elastic Search to the server, and/or what should I look for in a server so that I will be able to use Elastic Search?
You'll need a JRE; that's about it as far as just getting started goes.
See http://www.elasticsearch.org/guide/reference/setup/installation.html for more details.
You'll probably want to run it as a service. If you're using Windows, you can download installers here: https://github.com/rgl/elasticsearch-setup/downloads
Hit me up if you need any help.
I just found the wonderful ElasticFox, a Firefox plugin that makes working with Amazon EC2 much more enjoyable. Is there a similar tool for Amazon RDS?
Or, rather, what is the best/easiest tool to work with RDS?
I have been using MySQL Workbench http://www.mysql.com/products/workbench/ with RDS and it works great. It is very easy to create and save a new database service instance: click "New Server Instance" under "Server Administration" and follow the prompts. You will need to enter the information provided on the AWS RDS web page for that instance (for example, its endpoint).
NOTE: In order for you to actually connect, you MUST add your IP address in the "DB Security Groups." The link is in the left-hand column, which is titled "Navigation." I use the "CIDR/IP" option (the other is EC2 Security Group). Make sure to include a /## after the IP, such as the /32 they use in the example. In a few seconds, the IP address should be authorized.
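The /## suffix is CIDR notation, which says how many leading bits of the address are fixed; /32 means exactly one IPv4 address. A quick illustration (nothing AWS-specific, and the addresses are made up) using Python's standard ipaddress module:

```python
import ipaddress

# /32 covers exactly one address; /24 covers 256 addresses
single = ipaddress.ip_network("203.0.113.25/32")
block = ipaddress.ip_network("203.0.113.0/24")

print(single.num_addresses)                           # 1
print(block.num_addresses)                            # 256
print(ipaddress.ip_address("203.0.113.25") in block)  # True
```

So authorizing your.ip.addr.ess/32 in the security group opens RDS access to your single IP only, which is usually what you want.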
After the new security group has been authorized, the "DB Security Groups" of the DB instance running MySQL needs to be updated to include this newly created security group. After this update, the "DB Security Groups" should show at least two 'active' security groups: the one that was already present and the one newly created in the previous step.
After that, go back to MySQL Workbench and complete the New Server Instance creation process.
I'd say the AWS Console and RDS CLI along with MySQL client itself are totally sufficient.
Anything particular you are looking for?
The AWS Console is good enough to monitor and configure RDS. However, we can't change some parameters with the AWS Console (like mysql.ini parameters); in that case you have to use the RDS command line tools.
Still, if you don't want to mess with the command line APIs, you can use a cloud management system (free edition) as a GUI tool, such as RightScale.
Here is a post where you can see how third-party GUI tools can be used to work with Amazon RDS.
Try DBHawk from Datasparc. It can connect to cloud databases such as Amazon RDS and MS Azure.