As a developer, I want to know whether there are best practices for listing project maintainers inside a Go project.
In PHP projects this can be done by updating the composer.json file with content similar to:
{
  "authors": [
    {
      "name": "Name Surname",
      "email": "foo@bar.man",
      "role": "Developer"
    }
  ]
}
Is there a standard Go alternative?
No. Go packages do not track the developer(s) or contributor(s) in any prescribed way.
Of course you can always add them to a README, wiki, code comments, or other human-consumable information as you see fit.
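If you want that information to live next to the code, one common spot is a package doc comment; a minimal sketch (the package name and contact details are made up):
// Package widgets renders dashboard widgets.
//
// Maintainers:
//   - Name Surname <foo@bar.man> (Developer)
package widgets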
We have an SDK project that includes a test engine. These live under two different .csproj files, SDK.csproj and TestEngine.csproj, where SDK.csproj is a ProjectReference of TestEngine.csproj.
I have a DocFx project set up that builds metadata for these two separately, with docfx.json metadata that looks like this:
{
  "src": [
    {
      "src": "../../sdk",
      "files": ["csharp/SDK/**/*.cs"]
    }
  ],
  "dest": "reference/SDK"
},
{
  "src": [
    {
      "src": "../../sdk",
      "files": ["csharp/TestEngine/**/*.cs"]
    }
  ],
  "dest": "reference/TestEngine"
}
This way I can set up TOCs to put these documentation trees under separate tabs.
However, I cannot use a cref in TestEngine xml docs that refers to a class from SDK. I get an error like this from DocFX build:
Warning:[MetadataCommand.ExtractMetadata]Invalid cref value "!:SDK.SDKClass" found in triple-slash-comments for TestEngineClass, ignored.
I can imagine why this fails: DocFX is generating the metadata for TestEngine alone, so it doesn't know about the SDK classes or how to link to them. Is there a way I can change the configuration so that I can keep these two projects separate (under separate TOCs) in the final website but still link from TestEngine to SDK classes?
I realized that using @ and/or <xref> tags, as described in the DocFx documentation, does resolve to links properly in the generated web pages. So that gets me a lot of what I want. However, it's not a complete solution, as other generated references do not resolve to links. For example, if a TestEngine method has a parameter of type SDK.SDKClass, the generated docs won't make a link for SDKClass where it appears in the parameter documentation. So I'm still wondering if there is another solution.
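For reference, the @ and <xref> cross-reference forms from the DocFx documentation look like this in a markdown page (using the UID from the question):
See @SDK.SDKClass for details, or the explicit form <xref:SDK.SDKClass>.
You can also use a markdown link: [SDKClass](xref:SDK.SDKClass).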
Given a solution containing both a Web Api project, an Api Client Project, and an assembly project containing shared Dto objects as follows:
Solution
  Web Api Project
  Shared Dto Project
  Api Client Project
  Unit Tests
I have an Azure DevOps build pipeline to build and test all of the projects.
Within that build pipeline, I have a step that creates 2 NuGet packages, for the Api Client and the Shared Dto projects.
However, I don't need to create new packages every time. Say I only added a unit test or fixed a bug in the Web Api project: I don't want to create new NuGet packages (or worry about bumping the package version) if nothing in the Api Client project (or its dependencies) has changed, but I do still want to regression-test it.
I thought that perhaps I could do this by having multiple build pipelines, one for the Web Api project, and others for the Api Client & Shared projects, then use path filters to trigger the appropriate builds.
Are there any other ways to achieve this?
What's the best practice?
I don't want to have to maintain more build definitions than I need.
I ended up using a slight modification of this answer:
# Checks which files have been updated to determine if new NuGet packages are required.
# Wrapping in @() ensures we always get an array, even when only one file changed.
$editedFiles = @(git diff HEAD HEAD~ --name-only)
echo "$($editedFiles.Length) files modified:"
$editedFiles | ForEach-Object {
    echo $_
    Switch -Wildcard ($_) {
        'Whds.Configuration.Models/*' {
            # If the Models project is updated, we need to create both packages
            Write-Output "##vso[task.setvariable variable=CreateModelsPackage]True"
            Write-Output "##vso[task.setvariable variable=CreateClientPackage]True"
        }
        'Whds.Configuration.Standard/*' {
            Write-Output "##vso[task.setvariable variable=CreateClientPackage]True"
        }
        # The rest of your path filters
    }
}
This script sets variables which are then referenced in custom conditions in the dotnet pack step in the build pipeline:
and(succeeded(), eq(variables['CreateModelsPackage'], 'True'))
If the Dto project is changed, both variables are set in order to create both packages.
If the client (aka Standard) project is the only thing that has changed, the package for the Dto project will not be created.
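For context, a custom condition on a pack step would look roughly like this; the task inputs shown are illustrative:
steps:
- task: DotNetCoreCLI@2
  displayName: 'Pack Api Client package'
  condition: and(succeeded(), eq(variables['CreateClientPackage'], 'True'))
  inputs:
    command: 'pack'
    packagesToPack: 'Whds.Configuration.Standard/*.csproj'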
Are there any other ways to achieve this? Whats the best practice? I don't want to have to maintain more build definitions than I need.
There are different ways to achieve it, but I'm not sure which one is the best practice; it all depends on your requirements and tastes.
The simplest method is similar to your own idea. It also needs a new build pipeline, but the difference is that we do not need to maintain this build definition.
Details:
1. Add a new pipeline without any tasks in it, and use path filters to trigger it for the appropriate projects (the Api Client and Shared Dto projects); see the trigger sketch below.
2. Add a build completion trigger to your original Azure DevOps build pipeline.
3. Add a custom condition, based on Build.Reason, for the step that creates the 2 NuGet packages, like:
and(succeeded(), eq(variables['Build.Reason'], 'BuildCompletion'))
Now the steps that create the 2 NuGet packages are only executed when the file changes come from a specific project. Of course, the limitation of this solution is that if you already have a build completion trigger, it will not work.
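A minimal sketch of the path-filtered trigger for the new pipeline from step 1 (the branch and folder names are assumptions):
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - ApiClient
    - SharedDto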
If the above method is not what you want, we could invoke the Commits REST API to get the commit info for each build:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}?changeCount=100&api-version=5.1
We can find the changed paths in the returned body:
"changes": [
{
"item": {
"gitObjectType": "blob",
"path": "/.gitattributes",
"url": "https://dev.azure.com/fabrikam/_apis/git/repositories/278d5cd2-584d-4b63-824a-2ba458937249/items/.gitattributes?versionType=Commit"
},
"changeType": "add"
},
{
"item": {
"gitObjectType": "blob",
"path": "/.gitignore",
"url": "https://dev.azure.com/fabrikam/_apis/git/repositories/278d5cd2-584d-4b63-824a-2ba458937249/items/.gitignore?versionType=Commit"
},
"changeType": "add"
},
{
"item": {
"gitObjectType": "tree",
"path": "/MyWebSite",
"isFolder": true,
"url": "https://dev.azure.com/fabrikam/_apis/git/repositories/278d5cd2-584d-4b63-824a-2ba458937249/items/MyWebSite?versionType=Commit"
},
"changeType": "add"
},
Then we could use a PowerShell script to traverse these paths and see whether they include the Api Client and Shared Dto projects; if they do, we set a variable, and add a condition based on its value to the steps that create the 2 NuGet packages.
Note: before calling the commit endpoint above, we need to use Commits - Get Commits to get the latest commit ID:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits?$top=1&api-version=5.1
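A rough PowerShell sketch combining the two calls (the organization/project/repository placeholders and the path filters are assumptions, and $env:SYSTEM_ACCESSTOKEN requires the "Allow scripts to access the OAuth token" option):
# Get the latest commit, then inspect its changed paths.
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
$base = "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}"

# Latest commit id (the backtick stops PowerShell from expanding $top)
$latest = Invoke-RestMethod -Headers $headers -Uri "$base/commits?`$top=1&api-version=5.1"
$commitId = $latest.value[0].commitId

# Changed paths for that commit
$commit = Invoke-RestMethod -Headers $headers -Uri "$base/commits/${commitId}?changeCount=100&api-version=5.1"
foreach ($change in $commit.changes) {
    switch -Wildcard ($change.item.path) {
        '/SharedDto/*' {
            Write-Output "##vso[task.setvariable variable=CreateModelsPackage]True"
            Write-Output "##vso[task.setvariable variable=CreateClientPackage]True"
        }
        '/ApiClient/*' {
            Write-Output "##vso[task.setvariable variable=CreateClientPackage]True"
        }
    }
}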
Hope this helps.
Here I have a Media class at the Spatie\MediaLibrary\Models\Media path. It was creating PDO issues with this code:
class Media extends Model implements Responsable, Htmlable
So I tried this in there instead, and it worked:
use Jenssegers\Mongodb\Eloquent\Model as Eloquent;
class Media extends Eloquent implements Responsable, Htmlable
Now it's fine as long as it keeps working, but I guess updating Composer will remove everything and the PDO problem will appear again. So how can I publish this change to make it permanent?
I think your best bet here would be to fork the spatie/laravel-medialibrary repository. You can then make your changes in your fork and commit them. Then you'll be able to use your commit as the package version in your composer.json and your fork as the repository.
For example, in your composer.json change your spatie/laravel-medialibrary requirement to "spatie/laravel-medialibrary": "dev-{your-branch-name}#{your commit hash}", and add a "repositories" field to your composer.json like this:
"repositories": [
{
"type": "vcs",
"url": "https://github.com/{your github username}/laravel-medialibrary"
}
]
Have a look at this StackOverflow question for more examples on how to use your own commit in your package. Also have a look at Composer's "Repositories" documentation to see other ways to add repositories (for example to use a local path) and their "Versions#Branches" documentation to see how to specify branches as versions.
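Putting the two pieces together, the relevant parts of composer.json would look roughly like this (the branch name, commit hash, and username are the same placeholders as above):
{
  "require": {
    "spatie/laravel-medialibrary": "dev-{your-branch-name}#{your commit hash}"
  },
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/{your github username}/laravel-medialibrary"
    }
  ]
}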
I've started playing with Strapi to see what it can do.
When generating an API through Strapi Studio, it generates a set of base files to handle the model and API calls.
In the entity folder (e.g. article), there's a templates/default folder created with a default template. For an article entity, I get an ArticleDefault.template.json file with this:
{
  "default": {
    "attributes": {
      "title": {},
      "content": {}
    },
    "displayedAttribute": "title"
  }
}
In Strapi Studio I then also add additional templates for each entity, giving it multiple templates.
The command-line API generator does not create the templates folder.
I couldn't find anything about it in the documentation I read.
What are the generated templates for?
When would I use them, and how would I choose a particular template if I have multiple?
I'm one of the authors of Strapi.
A template is like a schema of data. Let's take a simple example: you have an API called Post. Sometimes your posts have a title and a content attribute, but other times your posts have a title, a subtitle, a cover and a content attribute. In both cases we're talking about the same API Post, but your schema of data is different. That's why we implemented templates: your needs can differ for the same content.
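Following that example, a second template file for the same entity might look along these lines (the template name and attributes are hypothetical):
{
  "extended": {
    "attributes": {
      "title": {},
      "subtitle": {},
      "cover": {},
      "content": {}
    },
    "displayedAttribute": "title"
  }
}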
Then, as you said, the CLI doesn't generate a templates folder in the project. The Studio doesn't use the same generator as the CLI, but the behavior of your API is the same.
Where is the best place to put a Form Macro in a Laravel 4 package? Looking through the package documentation, I don't see an obvious place. Seems like it won't fit anywhere under /src. I've only been learning Laravel 4 for 2 weeks, so I'm pretty new at this.
Thanks for any advice.
You will struggle to access the Form facade if you directly autoload that file from composer.json. A better solution is to add your macros in app/macros.php and then add the following line in app/start/global.php:
require app_path().'/macros.php';
Documented under Start Files here: http://laravel.com/docs/lifecycle
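For illustration, app/macros.php could contain something like this (the macro itself is just an example):
<?php

// app/macros.php -- loaded from app/start/global.php.
// Anything registered here becomes available as Form::deleteButton(...) in views.
Form::macro('deleteButton', function($text = 'Delete')
{
    return '<button type="submit" class="btn btn-danger">'.e($text).'</button>';
});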
There isn't one explicit place; just like at the app level, form macros don't have a dedicated file to be placed in.
Just make sure you load the file where you register the macro.
You could even put it inside the ServiceProvider if it is only a single macro we are talking about; a sketch of that follows the snippet below.
Or autoload the file from composer.json:
{
  "autoload": {
    "files": [
      "path/to/macros.php"
    ]
  }
}
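And a minimal sketch of the ServiceProvider approach mentioned above (the class, package, and macro names are illustrative):
<?php

use Illuminate\Support\ServiceProvider;

class ExampleServiceProvider extends ServiceProvider {

    public function boot()
    {
        $this->package('vendor/example');

        // Register the macro when the package boots
        Form::macro('fancyInput', function()
        {
            return '<input type="text" class="fancy">';
        });
    }

    public function register()
    {
        //
    }
}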