There are two ways to modify these files to add user-defined functions to GSQL:
(1) Store the files in a GitHub repository, and configure GSQL to read from the repository.
(2) Use GET and PUT commands to download, modify, and store the files locally.
How do I carry out these steps?
Any help would be appreciated.
https://docs.tigergraph.com/gsql-ref/current/querying/func/query-user-defined-functions
I tried creating a GitHub account and adding the UDF file to GSQL, but it doesn't work. The expected behavior is described above.
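For reference, my understanding of option (2) from the linked docs is roughly the following, run from the GSQL shell (the local file path is just an example):

GET ExprFunctions TO "/tmp/ExprFunctions.hpp"
(edit /tmp/ExprFunctions.hpp locally to add the UDF)
PUT ExprFunctions FROM "/tmp/ExprFunctions.hpp"

As I understand it, after the PUT the new UDF should be usable in queries.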
I want to clone the ELK 8 integrations repository onto a local server and modify a few of the integrations to remove some of their inputs. Would anyone please guide me step by step through what I should do? In detail, I would like to know:
Where in the code below should I make changes, and how do I build it?
https://github.com/elastic/integrations
Should I also clone and change the Elastic Package Registry code?
https://github.com/elastic/package-registry
How do I route Kibana to get the integrations list from the cloned repository?
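For context, the rough workflow I imagine (based on the two repos' READMEs; the package name, commands, and settings below are my guesses, so please correct them) is something like:

git clone https://github.com/elastic/integrations
cd integrations/packages/<some_package>    # edit the data streams / inputs I want to remove
elastic-package build                      # build the modified package
elastic-package stack up -d                # optional: local stack with a package registry for testing

and then, for the real deployment, serving the built packages from a self-hosted package registry and pointing Kibana at it via xpack.fleet.registryUrl in kibana.yml. Is that the right idea?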
I have been given a task to delete old SNAPSHOT artifacts that sit under many different folders/directories.
We can't go and delete each and every artifact manually, so I would like to do this through the REST API.
For clarity, here are some examples:
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/dddd/XYZ-SNAPSHOT/abc.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/dddd/XYZ-SNAPSHOT/xyz.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/eeee/XYZ-SNAPSHOT/pqr.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/dddd/eeee/XYZ-SNAPSHOT/lmn.jar
The four examples above are in different directories.
My script needs to go into each directory and check for XYZ-SNAPSHOT; if it is found, it can build a URL and delete the artifact through curl.
How can we achieve this? Or is there another way to do it?
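Roughly, what I have in mind is something like the following (the credentials are placeholders, and the file-list call would need checking against the Artifactory REST API docs):

# list every file in the repo, keep only those under a *-SNAPSHOT folder, delete them
curl -s -u user:pass \
     "https://artifactory.com/artifactory/api/storage/maven-local?list&deep=1" \
  | jq -r '.files[].uri' \
  | grep -e '-SNAPSHOT/' \
  | while read -r path; do
      curl -u user:pass -X DELETE "https://artifactory.com/artifactory/maven-local${path}"
    done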
You probably want to use Artifactory Query Language (AQL), which is the easiest way to find artifacts and modules according to patterns. You can find a bunch of examples on that page. Moreover, to perform the deletion easily, and even automate the process in the future, I advise using the JFrog CLI. You can also read this interesting blog post about a similar use case.
Also, there is the 'Max Unique Snapshots' field in your local Maven repository settings in Artifactory. You can use it to have Artifactory keep only a specified number of unique snapshots per artifact.
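For example, something along these lines (the repository name and pattern are illustrative; check the AQL and JFrog CLI docs for the exact syntax and flags):

# find the matching artifacts with AQL
curl -u user:pass -X POST -H "Content-Type: text/plain" \
     "https://artifactory.com/artifactory/api/search/aql" \
     -d 'items.find({"repo":"maven-local","path":{"$match":"*-SNAPSHOT"}})'

# delete by pattern with the JFrog CLI, previewing first with --dry-run
jf rt del "maven-local/*-SNAPSHOT/" --dry-run
jf rt del "maven-local/*-SNAPSHOT/" --quiet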
I want to set up an automatic upload to FTP, using the 'FTP Upload' runner, in a separate build configuration that depends on a successful build of the main configuration. The problem is that I don't know the path pattern. As of now the path looks like this:
C:\ProgramData\JetBrains\TeamCity\system\artifacts\<project_name>\<build config name>\528
What variable contains this last number?
The problem was the poor description of my problem; here is a more definitive one:
I have to store artifacts on FTP. The FTP server is on the same machine as the TeamCity server and agent (don't ask me why). So I have to somehow grab the artifacts and put them into ftp://"project"/msi and ftp://"project"/nuget, depending on the build configuration. I tried grabbing the artifacts directly from the folder shown in the initial post, but that idea failed.
The solution is to create another build configuration and set up artifact dependencies; this makes the artifacts reachable from the new build configuration, which allows the FTP Upload runner to be used.
Thanks everyone!
I uploaded a jar to the server, but I got the error below. I have tried many times.
I then logged into https://oss.sonatype.org/#stagingRepositories; I can see the list of staging repositories, but I cannot drop or release the repository. It returns a 403.
Could anyone help me? Thanks a lot.
Your account doesn't have permissions to perform these actions.
This isn't the right place to ask for help with this; file an issue at https://issues.sonatype.org in the "Community Support - Open Source Project Repository Hosting (OSSRH)" project.
I've been tasked with writing scripts to interact with Nexus/Maven. The files I'm working with in Maven are XML files placed there with the specific idea that they would be used by shell scripts. Essentially, the files are configurations for another application.
I've already completed the scripts to pull the files from the repositories, but I'm having problems with putting files into the repositories. To pull the files, I'm using the plugin dependency:get.
What I need is more or less the opposite of that plugin: one that will update the repository with new versions of a file. I think "mvn deploy:deploy-file" is what I need to use. Will that work?
If so, then the next problem I have is that I can't seem to figure out how to set up the authorization. I have a settings file with a server defined that has the correct authorization information in it, but the link between the server and the repository (or URL?) is missing and the authorization isn't being performed correctly.
How do I connect the repository URL to the server info in the settings.xml file so that mvn will be authorized to perform the correct actions? (I don't know where the .pom file is for Maven, and may not have permissions to alter it.)
Thanks,
Sean.
deploy:deploy-file is correct. Use it with -Durl=http://repo:port/path and -DrepositoryId=server-whatever. Your settings.xml needs to contain
<servers>
  <server>
    <id>server-whatever</id>
    <username>demo</username>
    <password>demo</password>
  </server>
</servers>
where the server ID server-whatever matches the repositoryId parameter.
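A complete invocation would then look roughly like this (the file name, coordinates, and URL are placeholders):

mvn deploy:deploy-file \
    -Dfile=config.xml \
    -DgroupId=com.example -DartifactId=app-config \
    -Dversion=1.0.0 -Dpackaging=xml \
    -DrepositoryId=server-whatever \
    -Durl=http://repo:port/path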
Having said that, I'd question the appropriateness of Maven for this. It's designed for binary artifacts rather than configuration files.
The problem turned out to be in the -Durl option.
When using the dependency:get plugin, the URL was something like:
-Durl=http://companymavenrepo
And that worked fine for the dependency plugin.
However, that's not sufficient when trying to put things into the repository using the deploy plugin. The URL has to contain the Maven server and the exact repository where the artifact should be placed. (My terminology might be off.) I went to our Nexus/Sonatype web page, looked at the exact repository where the artifact was stored, and then used something like this:
-Durl=http://companymavenrepo/nexus/content/repositories/this_maven_repo
That solved the authorization problem, and I was able to add the file into the repository without issue.
(This might have been easier for others to see had I posted both mvn command lines I was trying to use. On the other hand, it also seemed reasonable to assume that if the -Durl option works with a specific value in one command line, it will work unchanged in another.)
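For completeness, the two invocations looked roughly like this (the coordinates are placeholders, and the dependency:get parameters are written as in the current plugin docs rather than exactly as I originally typed them):

# fetching worked with just the base repository URL
mvn dependency:get \
    -Dartifact=com.example:app-config:1.0.0:xml \
    -DremoteRepositories=http://companymavenrepo

# deploying needed the full path to the specific hosted repository
mvn deploy:deploy-file \
    -Dfile=app-config.xml \
    -DgroupId=com.example -DartifactId=app-config \
    -Dversion=1.0.1 -Dpackaging=xml \
    -DrepositoryId=server-whatever \
    -Durl=http://companymavenrepo/nexus/content/repositories/this_maven_repo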
Sean.