I have read somewhere that it is possible to export a JMeter configuration file to a repository, e.g. GitHub or Bitbucket, but I have been unable to find the file.
I can set up JMeter to run and can interpret the results but my aim is to share the configuration file via a repository. I was expecting to find the configuration file in the JMeter /bin directory but it doesn't seem to be there. Where else could it be located?
Here are the steps to follow:
Sign up on GitHub.
Create a public or private repository of your choice (by clicking the New Repository button). Public: all users can access your repository. Private: only the users you add can access it. Private repositories come with a price tag, whereas public repositories are free.
Once the repository is created, you can upload files from your PC to the repository using the Upload files button. Select jmeter.properties or any other config file you want to share.
Once all the files are uploaded, commit the changes: give a summary of what you are uploading in the text fields (this is just for reference) and click the Commit Changes button. With this step, your config file is saved in your GitHub account.
If it is a public repository, you can share its URL with whomever you want. If it is a private repository, you can add people as collaborators on the repository so that they can download/update the files in it.
You can download the GitHub Desktop application to perform operations on your repositories.
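The web-UI steps above can also be sketched from the command line; the repository name, file, and identity below are placeholders:

```shell
# Create a local repository and commit the config file into it
# (copy in your real jmeter.properties from JMETER_HOME/bin
# instead of the stand-in touch below).
git init jmeter-config && cd jmeter-config
touch jmeter.properties
git add jmeter.properties
git -c user.name="you" -c user.email="you@example.com" \
    commit -m "Add shared JMeter configuration"
# After creating the repository on GitHub, publish it:
# git remote add origin https://github.com/<you>/jmeter-config.git
# git push -u origin main
```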
Why do you prefer GitHub? A simpler way to achieve this is to upload the file to a cloud storage service like OneDrive, Google Drive, or Dropbox and share it from there.
GitHub is a version-control service used to maintain projects (it keeps the code version by version), though you can also use it for uploading and sharing files.
Related
I have Artifactory set up and working, serving other artifacts (RPM, etc.).
I would like to have local copies of public and private Go programs and libraries
to ensure version consistency
to let public repositories work bugs out without immediately affecting us
to stay secure against unauthorized alterations to public repositories
I've created a Go repository in Artifactory, and populated it with, as an example, spf13/viper using frog-cli (which created a zip file and a mod file)
Questions:
Is the zip file the proper way to store Go modules in Artifactory?
How does one use the zip file in a Go program? E.g., the URL to get the zip file is http://hostname/artifactory/reponame/github.com/spf13/viper/@v/v1.6.1.zip (and .mod for the mod file). E.g., do I set GOPATH to some value?
Is there a way to ensure all requirements are automatically included in the local Artifactory repository at the time the primary package (e.g. viper) is added?
Answering 3rd question first -
Here's an article that will help: https://jfrog.com/blog/why-goproxy-matters-and-which-to-pick/. There are two ways to publish private Go modules to Artifactory. The first is the traditional way, i.e. via the JFrog CLI, as highlighted in another article.
The other way is to point a remote repository at a private GitHub repository; this capability was added recently. In this case, the virtual repository has two remotes: the first remote repository defaults to GoCenter, through which public Go modules are fetched, and the second points to the private VCS system.
Setting GOPROXY to ONLY the virtual Go modules repository ensures that Artifactory remains the source of truth for both public and private Go modules. If you want to store compiled Go binaries, you can use a local generic repository, but I would advise using a custom layout to structure its contents.
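As a client-side sketch, assuming a virtual Go repository named go-virtual on a host artifactory.example.com (both hypothetical), the setup amounts to two environment variables:

```shell
# Resolve ALL modules through the virtual repository (names hypothetical):
export GOPROXY="https://artifactory.example.com/artifactory/api/go/go-virtual"
# Private module paths should bypass the public checksum database:
export GOPRIVATE="github.com/mycompany/*"
# From here, `go build` and `go mod download` fetch via Artifactory only.
```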
Answering the first 2 questions -
Go modules is the package manager for Go (Golang), similar to what Maven is for Java. In Artifactory, there are three files for every Go module version: go.mod, a .info file, and the archive file.
Artifactory follows the GOPROXY protocol, so the dependencies listed in go.mod are automatically fetched from the virtual repository. This includes the archive file too, which is a collection of packages (source files).
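For illustration, a consuming project's go.mod (the module name example.com/myapp is hypothetical) only needs to list the direct requirement; the proxy supplies viper's own dependencies:

```
module example.com/myapp

go 1.13

require github.com/spf13/viper v1.6.1
```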
There's additional metadata that's stored for public go modules such as tile and lookup requests since GoSumDB requests are cached to ensure that Artifactory remains the source of truth for modules and metadata even in an air-gapped environment.
Our organization has a locally running instance of Artifactory, and also a local instance of Bitbucket. We are trying to get them to play well together so that Artifactory can serve up our private PHP packages right out of Bitbucket.
Specifically, we'd like to create a Composer Remote Repository in Artifactory that serves up our private PHP packages, where those packages are sourced from git repositories on our local Bitbucket server.
Note that we'd rather not create and upload our own package zip files for each new package version, as suggested here. Ideally, we just want to be able to commit changes to a PHP package in BitBucket, tag those changes as a new package version, and have that new version be automatically picked up and served by Artifactory.
The Artifactory Composer documentation suggests that this is possible:
A Composer remote repository in Artifactory can proxy packagist.org
and other Artifactory Composer repositories for index files, and
version control systems such as GitHub or BitBucket, or local
Composer repositories in other Artifactory instances for binaries.
We've spent a lot of time trying to make this work, but haven't been able to. The remote repository we create always remains empty, no matter what we do. Can anyone offer an example to help, or even just confirm that what we're attempting isn't possible?
For reference, we've been trying to find the right settings to put into the repository setup page.
Thanks!
Artifactory won't download and pack the sources for you; it expects to find binary artifacts.
The mention of source control in the documentation refers to downloading archives from source control systems, either uploaded there as archives (don't do that) or packed by the source control system on download request (which is what you are looking for).
You can use this REST API to download automatically generated zips from Bitbucket. If you can configure the Composer client to look for packages in the right place, you're all set.
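As a sketch of that REST API, Bitbucket Server's archive endpoint packs a repository at a given ref into a zip on request; the host, project key, repository slug, and tag below are all hypothetical:

```shell
# Build the archive URL for a tagged package version (names hypothetical):
BASE="https://bitbucket.example.com/rest/api/1.0"
URL="$BASE/projects/PROJ/repos/my-php-package/archive?at=refs/tags/v1.2.0&format=zip"
echo "$URL"
# Download it with your credentials:
# curl -u user:token -o my-php-package-v1.2.0.zip "$URL"
```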
Basically, we already had a Nexus server configured, but we no longer have access to it for some reason. I have the repository folder, which contains all the dependencies I will need to upload to the new Nexus server.
I could upload each dependency one by one using the command line or the Nexus UI, but I have many dependencies, so I am looking for an alternative.
I found a solution here and am following the 3rd suggestion, but I don't understand which access I will need, because I do not see such a directory on my local machine.
I have extracted the nexus-3.12.1-01 setup on my Windows machine.
Where can I find this folder?
I'm the only developer on the team and will be working on a private project that doesn't need to be placed on GitHub or anywhere else online. My entire project will live in a folder on my local machine. Is it possible to create a project in TeamCity that points to my local folder? I'm using TC version 10. When I navigate to Create Project, I only see Manually, GitHub, URL Repository, and Bitbucket Cloud Repository. Thinking logically, I went to set up the project Manually, but there is a Project ID field that seems to require some sort of URL. Just curious if this is at all possible with TeamCity? Thanks.
Yes, it is possible.
Choose Git as the type of repository and, in the mandatory Fetch URL field, specify the local path: /path/to/repository.
Click Test connection to make sure TeamCity is able to fetch data.
You won't be able to configure triggers against this repository; it is still an open issue: https://youtrack.jetbrains.com/issue/TW-12162
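A minimal sketch of the repository side, assuming a hypothetical local path; TeamCity then fetches from the plain path or the equivalent file:// URL:

```shell
# Create a bare repository TeamCity can fetch from (path hypothetical):
git init --bare /tmp/myproject.git
# Use either of these as the Fetch URL in the VCS root:
echo "/tmp/myproject.git"
echo "file:///tmp/myproject.git"
```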
With TC 2017.2.2, from URL Repository you can just give the path:
file:///home/user/dirOfProject
The requirements are as follows. We need copies of the binaries we use in our projects on our repository server. We can't just proxy the public repository, because we had several cases in the past where binaries on the public repository were changed without a change to the release number, and we want to avoid the problems that causes; thus we want to specify manually when to download from the public repository and when to update. No changes are ever to be made to a binary stored on our repository server without manual interaction.
Is there a way to achieve this? I.e., to say "I want artefacts X, Y, Z" copied to my repository server (preferably including their dependencies). Is this possible with either Nexus or Artifactory?
Yes. In Nexus, define your own local repository, manually download the versions you want, and add them to your repository. You may have to set up "manual routing" for dependency resolution to ensure that Nexus consults the repos in the correct order.
Then make sure your pom files refer to the specific versions you have downloaded.
One thing that will make this a little easier is that you can place the downloaded artifacts directly into the local storage directory of a Nexus repository (you don't need to upload them into Nexus).
See here for details: https://support.sonatype.com/entries/38605563
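A sketch of that approach for a Nexus 2-style install (the sonatype-work path, repository id, and Maven coordinates are all hypothetical): artifacts go into the repository's local storage directory in the standard groupId/artifactId/version layout.

```shell
# Local storage root of a hosted repository with id "thirdparty"
# (install path hypothetical):
STORAGE=/tmp/sonatype-work/nexus/storage/thirdparty
DEST="$STORAGE/commons-io/commons-io/2.11.0"   # groupId/artifactId/version
mkdir -p "$DEST"
# cp ~/Downloads/commons-io-2.11.0.jar "$DEST/"   # drop the artifact in
echo "$DEST"
```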