xmldoc <see> with cref that refers to a class under a separate csproj - xmldocument

We have an SDK project that includes a test engine. These live under two different csproj files, SDK.csproj and TestEngine.csproj, where TestEngine.csproj has a ProjectReference to SDK.csproj.
I have a DocFx project set up that builds metadata for these two separately, with docfx.json metadata that looks like this:
{
  "src": [
    {
      "src": "../../sdk",
      "files": ["csharp/SDK/**/*.cs"]
    }
  ],
  "dest": "reference/SDK"
},
{
  "src": [
    {
      "src": "../../sdk",
      "files": ["csharp/TestEngine/**/*.cs"]
    }
  ],
  "dest": "reference/TestEngine"
}
This way I can set up TOCs to put these documentation trees under separate tabs.
However, I cannot use a cref in TestEngine xml docs that refers to a class from SDK. I get an error like this from DocFX build:
Warning:[MetadataCommand.ExtractMetadata]Invalid cref value "!:SDK.SDKClass" found in triple-slash-comments for TestEngineClass, ignored.
I can imagine why this fails: DocFX is generating the metadata for TestEngine alone, so it doesn't know about the SDK classes or how to link to them. Is there a way I can change the configuration so that I can keep these two projects separate (under separate TOCs) in the final website but still link from TestEngine to SDK classes?
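For concreteness, a stripped-down sketch of the kind of triple-slash comment that triggers the warning (member names are illustrative):

// TestEngine/TestEngineClass.cs
namespace TestEngine
{
    /// <summary>
    /// Drives test runs against an <see cref="SDK.SDKClass"/> instance.
    /// When docfx builds metadata for TestEngine alone, this cref cannot be
    /// resolved and is emitted as "!:SDK.SDKClass", producing the warning above.
    /// </summary>
    public class TestEngineClass
    {
        /// <param name="target">The SDK object under test.</param>
        public void Run(SDK.SDKClass target)
        {
        }
    }
}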

I realized that the @ and/or xref tags described in the DocFx documentation do resolve to links properly in the generated web pages. So that gets me a lot of what I want. However, it's not a complete solution, as other generated references do not resolve to links. For example, if a TestEngine method has a parameter of type SDK.SDKClass, the generated docs won't make a link for SDKClass where it appears in the parameter documentation. So I'm still wondering if there is another solution.
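For reference, the cross-reference forms from the DocFX documentation that do resolve for me look like this (assuming SDK.SDKClass is the UID emitted by the SDK metadata):

@SDK.SDKClass
<xref:SDK.SDKClass>
[the SDK class](xref:SDK.SDKClass)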

Related

Custom Search Parameter for Extensions Resource on FHIR

Does FHIR support search based on Extension values?
I have added this extension under the ImagingStudy Resource
{
  "extension": [
    {
      "url": "http://hl7.org/fhir/SearchParameter/institution-name",
      "valueString": "Apollo"
    }
  ]
}
Is it possible to have a custom search parameter added for this extension such that it can be searched accordingly? If possible, how can I register it?
It's definitely possible to define search parameters that look at extensions. There's an example of one here: http://hl7.org/fhir/searchparameter-example-extension.html
However, the process of getting a given server to support those search parameters depends on what server you're using. Some of the reference implementation servers have an ability to generically support any 'normal' SearchParameter that is appropriately registered. Other servers will require custom coding to support new parameters.
Note that having an extension with a canonical URL that looks like a SearchParameter is going to be confusing to most implementers. If you're using a FHIR-based canonical URL, it should be a StructureDefinition.
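As a rough sketch only (modelled on the linked example; the canonical URLs, the FHIRPath expression and the registration mechanism all depend on your own definitions and on the server), a SearchParameter for the extension above might look like this:

{
  "resourceType": "SearchParameter",
  "url": "http://example.org/fhir/SearchParameter/imagingstudy-institution-name",
  "name": "InstitutionName",
  "status": "active",
  "description": "Search ImagingStudy by the institution-name extension",
  "code": "institution-name",
  "base": ["ImagingStudy"],
  "type": "string",
  "expression": "ImagingStudy.extension.where(url = 'http://example.org/fhir/StructureDefinition/institution-name').value"
}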

No matching variant of XXX was found. The consumer was configured to find attribute with value 'platform' and 'version-catalog'

After exporting versionCatalogs as a library and using it from other applications, the error states:
> No matching variant of io.github.elye:plugin-dependencies:1.0.0 was found. The consumer was configured to find attribute 'org.gradle.category' with value 'platform', attribute 'org.gradle.usage' with value 'version-catalog' but:
  - Variant 'apiElements' capability io.github.elye:plugin-dependencies:1.0.0:
      - Incompatible because this component declares attribute 'org.gradle.category' with value 'library', attribute 'org.gradle.usage' with value 'java-api' and the consumer needed attribute 'org.gradle.category' with value 'platform', attribute 'org.gradle.usage' with value 'version-catalog'
  - Variant 'runtimeElements' capability io.github.elye:plugin-dependencies:1.0.0:
      - Incompatible because this component declares attribute 'org.gradle.category' with value 'library', attribute 'org.gradle.usage' with value 'java-runtime' and the consumer needed attribute 'org.gradle.category' with value 'platform', attribute 'org.gradle.usage' with value 'version-catalog'
My versionCatalogs library is written as below:
plugins {
    id 'java-library'
    id 'kotlin'
    id 'version-catalog'
}

java {
    sourceCompatibility = JavaVersion.VERSION_1_7
    targetCompatibility = JavaVersion.VERSION_1_7
}

catalog {
    // declare the aliases, bundles and versions in this block
    versionCatalog {
        alias('androidx-core').to('androidx.core:core-ktx:1.6.0')
        alias('androidx-appcompat').to('androidx.appcompat:appcompat:1.3.1')
        alias('androidx-constraintlayout').to('androidx.constraintlayout:constraintlayout:2.1.0')
        alias('android-material').to('com.google.android.material:material:1.4.0')
        bundle('androidx', ['androidx-core',
                            'androidx-appcompat',
                            'androidx-constraintlayout'])
    }
}

ext {
    PUBLISH_GROUP_ID = 'io.github.elye'
    PUBLISH_VERSION = '1.0.0'
    PUBLISH_ARTIFACT_ID = 'plugin-dependencies'
}

apply from: "./publish-module.gradle"
When I try to access the versionCatalog library, the consuming settings.gradle is just as below:
dependencyResolutionManagement {
    // Some other codes
    versionCatalogs {
        xlibs {
            from("io.github.elye:plugin-dependencies:1.0.0")
        }
    }
}
How can I fix the error?
Looks like I just need to remove the below from the library's build.gradle file.
plugins {
    id 'java-library'
    id 'kotlin'
}
Hence the code looks like below
plugins {
    id 'version-catalog'
}

catalog {
    // declare the aliases, bundles and versions in this block
    versionCatalog {
        alias('androidx-core').to('androidx.core:core-ktx:1.6.0')
        alias('androidx-appcompat').to('androidx.appcompat:appcompat:1.3.1')
        alias('androidx-constraintlayout').to('androidx.constraintlayout:constraintlayout:2.1.0')
        alias('android-material').to('com.google.android.material:material:1.4.0')
        bundle('androidx', ['androidx-core',
                            'androidx-appcompat',
                            'androidx-constraintlayout'])
    }
}

ext {
    PUBLISH_GROUP_ID = 'io.github.elye'
    PUBLISH_VERSION = '1.0.0'
    PUBLISH_ARTIFACT_ID = 'plugin-dependencies'
}

apply from: "./publish-module.gradle"
I didn't quite follow Elye's answer, so I'll explain what I changed in my build.gradle.kts file to get around the problem.
I have a separate project that contains the toml version catalog. That project is simply responsible for publishing the version catalog to my nexus repository. In my case, I not only publish the toml version catalog, but I also publish 4 properties files. Those properties files correspond to the values of the versions, libraries, bundles and plugins sections of the toml version catalog. I do that so my runtime application can get access to the information contained in the toml file; normally the toml version catalog is only available to the build components. This allows me to make that data available to either runtime components or externally built custom Gradle plugins.
Here's what my settings.gradle.kts file contains:
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        maven {
            url = uri(System.getenv()["MyReleaseRepoUrl"].toString())
            credentials {
                username = System.getenv().get("MyRepoUser").toString()
                password = System.getenv().get("MyRepoUserPW").toString()
            }
        }
        google()
        mavenCentral()
    }
    versionCatalogs {
        create("libs") {
            from(files("src/main/resources/libs.versions.toml"))
        }
    }
}
So the settings file creates a version catalog for the toml file, which I store in the "src/main/resources" directory.
Here's what my build.gradle.kts file contains:
plugins {
    id("version-catalog")
    id("maven-publish")
    kotlin("jvm")
    id("com.xyz.versions.plugin")
}

publishing {
    publications {
        create<MavenPublication>("VersionCatalog") {
            groupId = "com.xyz"
            artifactId = "com.xyz.version.catalog"
            version = libs.versions.toml.get()
            //artifact("src/main/resources/libs.versions.toml")
            from(components["versionCatalog"])
        }
        create<MavenPublication>("VersionsProperties") {
            groupId = "com.xyz"
            artifactId = "com.xyz.version.versions"
            version = libs.versions.toml.get()
            artifact("src/main/resources/versions.properties")
        }
        ...
The plugin "version-catalog" is what makes "components["versionCatalog"]" available. Maven-publish is the plugin I use to do the publishing. And "com.xyz.versions.plugin" is a buildSrc plugin that is written in Kotlin. That plugin iterates each section of the version catalog and produces a corresponding properties file.
And lastly you see my definition of what I publish. I show how I publish the toml version catalog, and I show how I publish the versions.properties file that I generated. You should notice that I have the "artifact" line commented out in the publication description for the toml file. If I publish the file using the "artifact" line, I get the "no matching variant..." issue when I try to use the published version catalog in some other project. If I use the "from components" line instead, I don't get the "no matching variant..." issue when I import the version catalog in another project.
As I explained in the comment to my previous solution, it actually doesn't work. The problem with the code I posted is that it ended up publishing an empty version catalog, and an empty published version catalog happens to avoid the error. But of course, that's not what you want happening. The way I got the version catalog published at all was to publish it using the "artifact" line instead of the "components" line. I don't totally understand the circumstances that cause an empty version catalog to be published, but publishing it as an artifact instead of a component gets around that issue. Lots of people have observed this same issue and I think it's a Gradle bug.
With that said, the issue of "no matching variant..." would come up when I tried to use that published version catalog. In my case, it turns out that I was publishing the version catalog reusing the previous version number. I had set up my maven repository to let me do that. What I did to get around the problem was to run "gradlew build --refresh-dependencies" in the project that was importing the version catalog artifact. When I watched the build output, I got the clear sense that the version catalog was being reloaded. I let that task run for a while, but it seemed to be hung. After 10 minutes of waiting, I just went ahead and control-C'ed out of the task. I then ran "gradlew build" without the refresh-dependencies option. This time, it ran without producing the "no matching variant..." error. I also had the build script print out the value of one of the versions and libraries aliases, and it printed the values contained in the imported version catalog.
I'd posted two answers, and both of them at first seemed to resolve this problem, and then days later the problem started reoccurring. I think this time I have a better understanding of what's wrong, and I'll try to set the record straight in hopes that this info is useful to someone else experiencing this problem.
The first thing to understand is that the Gradle documentation on how to publish a version catalog and later import that published version catalog is close to nonexistent or impossible to find. Like most Gradle documentation, the best you can hope for is a coding example that shows you what to do, usually with no explanation of what it is doing. When it comes to publishing a version catalog, you'll see examples that show you how to publish using from(components["versionCatalog"]). So, the question is: why do you need to publish it as a component? After much searching around, I discovered that beginning in Gradle version 6, publishing Gradle Module Metadata is now the default when using the "maven-publish" or "ivy-publish" plugins. What's harder to find is that importing a version catalog from a repository relies on this additional metadata in order to work. This additional metadata gets stored in the repository along with the other required maven files when you publish an artifact. For most of us, we don't really need to know about these other files that get automatically generated when you publish an artifact to a repository; the publishing plugin hides that detail from us. When you publish an artifact, you not only add the artifact to the repository, but you also add a POM, maven-metadata, and a number of message digest calculations. There is also a defined directory structure governing where these files get stored in the repository. And beginning with Gradle version 6, maven-publish adds the "module" file to the mix.
Here's an example of what that "module" file contains:
{
  "formatVersion": "1.1",
  "component": {
    "group": "com.mycompany",
    "module": "com.mycompany.version.catalog",
    "version": "0.0.1-SNAPSHOT",
    "attributes": {
      "org.gradle.status": "integration"
    }
  },
  "createdBy": {
    "gradle": {
      "version": "7.4"
    }
  },
  "variants": [
    {
      "name": "versionCatalogElements",
      "attributes": {
        "org.gradle.category": "platform",
        "org.gradle.usage": "version-catalog"
      },
      "files": [
        {
          "name": "com.mycompany.version.catalog-0.0.1-SNAPSHOT.toml",
          "url": "com.mycompany.version.catalog-0.0.1-SNAPSHOT.toml",
          "size": 12663,
          "sha512": "0078f8cd00d4a49577bdee7917cafaac0a292761d197012529580c2a561c630360ced5689c07ff40373c32f5e3873b0910987acbc74a659b160c50fb35598b6b",
          "sha256": "051977a0ecdccea2e0708fa34d12390f3b4505010e7f00f4390c0c0cc5459e1d",
          "sha1": "8ce5ca0cda001dd67799c98a2605107674a3c92b",
          "md5": "8f348b9eef2b0ccb7d77d0fe17989f31"
The important thing to notice is the presence of the attribute "org.gradle.category" with value "platform" and the attribute "org.gradle.usage" with value "version-catalog".
If you look at Elye's original posted message, those are the attributes that can't be found. So, what the error message is saying is that when it tried to import the version catalog artifact from the repository, it first retrieved the module file and found that this artifact is not published as the kind of artifact it was expecting to find. Instead, the artifact that it did find had the attribute 'org.gradle.category' with value 'library' and the attribute 'org.gradle.usage' with value 'java-api'. The other possibility is that the repository did not contain a "module" file, in which case those attributes and values are assumed as the defaults. So, what the error message is telling us is that it found an artifact that matched the request, but it has not been published as the type of artifact that we were expecting.
So that is what I believe is going on, and if that's all you're looking for, you can stop here. The message means you've not properly published the version catalog. What I'll now attempt to explain is what happened to me and how I got confused. I'll also explain how I manifested the same problem from a different angle.
In my situation, I have a project whose only purpose is to define a version catalog that can be published. I have a number of other external projects that import that published version catalog. I've authored the version catalog so that it contains the version number of the version catalog that it publishes. When I publish the version catalog, I publish it using the version number contained in the version catalog itself. In order to do that, the version catalog needs to be created in the settings.gradle.kts script file. The build.gradle.kts script that does the publishing gets access to the version number using the accessor "libs.versions.toml.get()".
The problem is that I can't publish the version catalog declared in my settings file as a component. Hence I started publishing it as a file instead of a component. If I publish it as a file and not a component, the "module" file is not published, and that causes the error about mismatched attributes. The only way I could get it published with the module file was to apply the "version-catalog" plugin. That resulted in publishing an empty version catalog; in other words, it didn't know that I had a version catalog created in the settings file. Later I discovered that I could add a "versionCatalog" section to my build.gradle.kts script, and in that section I added a catalog that comes from the same file that the version catalog in my settings file uses. In other words, I've got two instances of version catalogs, both derived from the same file. That works, and when I now publish from components["versionCatalog"], I no longer publish an empty version catalog. In doing it this way, the module file gets added to the repository. And because I've got the version catalog in the settings file, I can extract the version number I need in my build.gradle.kts file. Life is good!
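A sketch of that arrangement in build.gradle.kts (group, artifact id and file locations are mine and purely illustrative):

plugins {
    `version-catalog`
    `maven-publish`
}

// Second catalog instance, built from the same TOML file that the settings
// script already imports; this is what components["versionCatalog"] publishes.
catalog {
    versionCatalog {
        from(files("src/main/resources/libs.versions.toml"))
    }
}

publishing {
    publications {
        create<MavenPublication>("VersionCatalog") {
            groupId = "com.mycompany"
            artifactId = "com.mycompany.version.catalog"
            // version number read from the settings-level catalog (the "toml" version alias)
            version = libs.versions.toml.get()
            // publishing the component (not a plain file artifact) is what writes the
            // Gradle Module Metadata "module" file with the version-catalog attributes
            from(components["versionCatalog"])
        }
    }
}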
One of the things that added to the problem was that I was republishing the catalog using the same version number. I would of course refresh my dependencies each time I did this. When I first started, I was publishing using "components"; when I later discovered that the catalog was empty, I started publishing using "files". The problem with that is, I believe, that a republish using files didn't cause the "module" file to go away. So the presence of the "module" file masked the fact that I wasn't properly publishing the version catalog. What's funny is that it worked for a couple of days and then the problem started reoccurring. I did look at the contents of my repository and saw that the module file was still there, and I recall that it had the right attributes, but after about 3 days of refreshing my dependencies the error came back. So there's a bit of a mystery as to why. I couldn't find anything in the maven repository, from a data perspective, that would signal that the module file shouldn't be associated with the given artifact. The only thing that comes to mind is that perhaps the datetime of the module file is not close enough to the datetime of the artifact. I think that may be what's happening, but that's pure speculation. It could also be some weird caching issue; I don't power my machine off every day, and the problem might have emerged after a reboot of my computer. The problem is that how Gradle interacts with the repository when retrieving a version catalog is not described anywhere.
Since this problem, I took the time to learn a little more about maven snapshot artifacts. I had avoided their use only because I was unaware of how snapshots are implemented. I've started using snapshots and have learned a little more about how they get added to the repository. I have a good feeling that this is going to straighten out some of the weird things I've been observing.

Only create Nuget package if project has changed

Given a solution containing a Web Api project, an Api Client project, and an assembly project containing shared Dto objects, as follows:
Solution
    Web Api Project
    Shared Dto Project
    Api Client Project
    Unit Tests
I have an Azure Devops build pipeline to build and test all of the projects.
Within that build pipeline, I have a step that creates 2 Nuget packages for the Api Client and the Shared Dto projects.
However, I don't need to create new packages every time. Say I only added a unit test or fixed a bug in the Web Api project: I don't want to create new Nuget packages (or worry about bumping the package version) if nothing in the Api Client project (or its dependencies) has changed, but I do still want to regression test it.
I thought that perhaps I could do this by having multiple build pipelines, one for the Web Api project, and others for the Api Client & Shared projects, then use path filters to trigger the appropriate builds.
Are there any other ways to achieve this?
What's the best practice?
I don't want to have to maintain more build definitions than I need.
I ended up using a slight modification of this answer:
# Checks which files have been updated to determine if new Nuget packages are required
$editedFiles = git diff HEAD HEAD~ --name-only
echo "$($editedFiles.Length) files modified:"
$editedFiles | ForEach-Object {
    echo $_
    Switch -Wildcard ($_) {
        'Whds.Configuration.Models/*' {
            # If the Models project is updated, we need to create both packages
            Write-Output "##vso[task.setvariable variable=CreateModelsPackage]True"
            Write-Output "##vso[task.setvariable variable=CreateClientPackage]True"
        }
        'Whds.Configuration.Standard/*' { Write-Output "##vso[task.setvariable variable=CreateClientPackage]True" }
        # The rest of your path filters
    }
}
This script sets variables which are then referenced in custom conditions in the dotnet pack step in the build pipeline:
and(succeeded(), eq(variables['CreateModelsPackage'], 'True'))
If the Dto project is changed, both variables are set in order to create both packages.
If the client (aka Standard) project is the only thing that has changed, the package for the Dto project will not be created.
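If the pipeline is defined in YAML, the dotnet pack steps with those custom conditions look roughly like this (task inputs and project paths are illustrative):

- task: DotNetCoreCLI@2
  displayName: Pack Whds.Configuration.Models
  condition: and(succeeded(), eq(variables['CreateModelsPackage'], 'True'))
  inputs:
    command: pack
    packagesToPack: 'Whds.Configuration.Models/*.csproj'

- task: DotNetCoreCLI@2
  displayName: Pack Whds.Configuration.Standard
  condition: and(succeeded(), eq(variables['CreateClientPackage'], 'True'))
  inputs:
    command: pack
    packagesToPack: 'Whds.Configuration.Standard/*.csproj'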
Are there any other ways to achieve this? Whats the best practice? I don't want to have to maintain more build definitions than I need.
There are different ways to achieve it, but not sure which one is the best practice, it all depends on your requirements or tastes.
The simple method is similar to your idea: we also need to create a new build pipeline, but the difference is that we do not need to maintain this build definition.
Details:
Add a new pipeline without any tasks in it, and use path filters to trigger it only for the appropriate projects (the Api Client and Shared Dto projects).
Add a build completion trigger to your original Azure DevOps build pipeline, pointing at that new pipeline.
Add a custom condition for the step that creates 2 Nuget packages based on the Build.Reason, like:
and(succeeded(), eq(variables['Build.Reason'], 'BuildCompletion'))
Now, the steps that create the 2 Nuget packages are only executed when the file changes come from a specific project. Of course, the limitation of this solution is that if you already have a build completion trigger, it will not work.
If the above method is not what you want, we could invoke the Commits REST API to get the commit info for each build:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}?changeCount=100&api-version=5.1
We could find the changes/path in the returned body:
"changes": [
{
"item": {
"gitObjectType": "blob",
"path": "/.gitattributes",
"url": "https://dev.azure.com/fabrikam/_apis/git/repositories/278d5cd2-584d-4b63-824a-2ba458937249/items/.gitattributes?versionType=Commit"
},
"changeType": "add"
},
{
"item": {
"gitObjectType": "blob",
"path": "/.gitignore",
"url": "https://dev.azure.com/fabrikam/_apis/git/repositories/278d5cd2-584d-4b63-824a-2ba458937249/items/.gitignore?versionType=Commit"
},
"changeType": "add"
},
{
"item": {
"gitObjectType": "tree",
"path": "/MyWebSite",
"isFolder": true,
"url": "https://dev.azure.com/fabrikam/_apis/git/repositories/278d5cd2-584d-4b63-824a-2ba458937249/items/MyWebSite?versionType=Commit"
},
"changeType": "add"
},
Then we could use a PowerShell script to traverse these paths to see whether they include the Api Client or Shared Dto projects; if yes, we set a variable with a value and add a condition based on that value to the steps that create the 2 Nuget packages.
Note: Before using the commits REST API above, we need to use Commits - Get Commits to get the latest commit id:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits?$top=1&api-version=5.1
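A rough PowerShell sketch of that flow (the organization/project/repository placeholders, the path filters and the variable names are illustrative; it assumes the job's System.AccessToken is mapped into the environment):

$base = "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}"
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }

# 1. Get the latest commit id
$latest = (Invoke-RestMethod -Uri "$base/commits?`$top=1&api-version=5.1" -Headers $headers).value[0]

# 2. Get the changed paths for that commit
$commit = Invoke-RestMethod -Uri "$base/commits/$($latest.commitId)?changeCount=100&api-version=5.1" -Headers $headers

# 3. Traverse the paths and set the variables used by the pack conditions
foreach ($change in $commit.changes) {
    Switch -Wildcard ($change.item.path) {
        '/SharedDto/*' {
            Write-Output "##vso[task.setvariable variable=CreateModelsPackage]True"
            Write-Output "##vso[task.setvariable variable=CreateClientPackage]True"
        }
        '/ApiClient/*' { Write-Output "##vso[task.setvariable variable=CreateClientPackage]True" }
    }
}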
Hope this helps.

Where and how to record project maintainers

As a developer, I want to know whether there are best practices for recording project maintainers inside a Go project.
In PHP projects this can be done by updating the composer.json file with content similar to:
{
  "authors": [
    {
      "name": "Name Surname",
      "email": "foo@bar.man",
      "role": "Developer"
    }
  ]
}
Is there a standard Go alternative?
No. Go packages do not track the developer(s) or contributor(s) in any prescribed way.
Of course you can always add them to a README, wiki, code comments, or other human-consumable information as you see fit.

In strapi what are api templates for?

I've started playing with strapi to see what it can do.
When generating an api through strapi studio, it generates a set of base files to handle the model and api calls.
In the entity folder (e.g. article), there's a templates/default folder created with a default template. For an article entity, I get an ArticleDefault.template.json file with this:
{
  "default": {
    "attributes": {
      "title": {},
      "content": {}
    },
    "displayedAttribute": "title"
  }
}
In Strapi Studio I also then add additional templates for each entity, giving it multiple templates.
The command line api generator does not create the templates folder.
I couldn't find anything about it in the documentation I read.
What are the generated templates for?
When would I use them, and how would I choose a particular template if I have multiple?
I'm one of the authors of Strapi.
A template is like a schema of data. Let's take a simple example: you have an API called Post. Sometimes your post has a title and a content attribute, but other times your post has a title, a subtitle, a cover and a content attribute. In both cases, we're talking about the same API Post, but your schema of data is different. That's why we implemented templates! Your needs could be different for the same content.
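For that Post example, a second template would simply carry the richer schema alongside the default one shown in the question; roughly (the exact file name and location depend on what the Studio generates):

{
  "extended": {
    "attributes": {
      "title": {},
      "subtitle": {},
      "cover": {},
      "content": {}
    },
    "displayedAttribute": "title"
  }
}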
Then, as you said, the CLI doesn't generate a templates folder in the project. The Studio doesn't use the same generator as the CLI, but the behavior of your API is the same.
