JavaScript rule extension - SonarQube

I am trying to extend SonarQube with custom JavaScript rules, and I find the documentation on this subject fairly limited. The extension tutorials on the SonarQube website only cover the most basic cases.
The only Javadoc I could find is this one: http://javadocs.sonarsource.org/latest/apidocs/ and it doesn't cover anything about extending JavaScript.
What I ultimately want to do is add a JS rule that checks for hardcoded secrets (such as passwords, API keys, etc.). I already created one for Java, and that was a lot easier because I could take a pre-made plugin and complete it with my custom regex.
The problem that actually made me post here is this compilation error:
cannot find symbol
symbol: class VariableTree
location: package org.sonar.plugins.javascript.api.tree.expression
I was following the same scheme as with Java and used
import org.sonar.plugins.javascript.api.tree.expression.VariableTree;
which is obviously wrong. I was not able to find the source code for this either... if anybody can point me to some secret doc stash, or at least tell me where I can find a Javadoc for org.sonar.plugins.javascript.api, that would be amazing!
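To make the question concrete, here is the rough shape of the check I am trying to end up with. The base class, tree types and addIssue call are guesses pieced together from the Java custom-rules API and from what little I could find about org.sonar.plugins.javascript.api, so they may well be wrong - that is exactly the part I'm stuck on:

    package org.example.javascript.checks;

    import java.util.regex.Pattern;
    import org.sonar.check.Rule;
    import org.sonar.plugins.javascript.api.tree.Tree;
    import org.sonar.plugins.javascript.api.tree.expression.LiteralTree;
    import org.sonar.plugins.javascript.api.visitors.DoubleDispatchVisitorCheck;

    // Guessed rule skeleton: flags string literals that look like hardcoded secrets.
    @Rule(key = "HardcodedSecret")
    public class HardcodedSecretCheck extends DoubleDispatchVisitorCheck {

      // Illustrative regex only; the real rule would reuse my existing secret patterns.
      private static final Pattern SECRET_PATTERN = Pattern.compile(
          "(password|passwd|api[_-]?key|secret)\\s*[=:]", Pattern.CASE_INSENSITIVE);

      @Override
      public void visitLiteral(LiteralTree tree) {
        if (tree.is(Tree.Kind.STRING_LITERAL) && SECRET_PATTERN.matcher(tree.value()).find()) {
          addIssue(tree, "Remove this hardcoded secret.");
        }
        super.visitLiteral(tree);
      }
    }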
Thanks very much for any help

Related

What do proto files represent for Google's API Linter?

Since Google did not create extensive documentation for their API Linter and I cannot find anything from other sources, I wanted to ask here.
From what exactly am I supposed to create proto files, and what do they represent?
As I understood it, these proto files get checked for compliance with their AIPs.
I'm interested in creating an automatic prototype (in Java) to check for customized API rules and am thinking about using Protobuf for this goal. Would that be a pragmatic solution?
Thank you!
As I now understand it, Google's API Linter isn't supposed to check an API or some other API specification for compliance.
It checks proto files, because they are themselves the API specification and can be converted to code. Before converting them, they can be checked against rules they have to comply with.
These rules are not in the proto files (as I initially thought) but in the many Go files under rules/.
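To make that concrete for myself: a file like the sketch below is what the linter takes as input, and the Go rules then flag AIP violations in it - for example, the AIP-131 naming rules would complain that this Get method's request message isn't called GetBookRequest. The service itself is purely illustrative:

    // library.proto - illustrative only; this is the kind of file the linter checks.
    syntax = "proto3";

    package example.library.v1;

    service Library {
      // AIP-131 expects Get<Resource>(Get<Resource>Request) returning the resource;
      // the linter's Go rules would flag the request message name used here.
      rpc GetBook(FetchBookRequest) returns (Book);
    }

    message FetchBookRequest {
      string name = 1;
    }

    message Book {
      string name = 1;
      string title = 2;
    }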
Please correct me if you read this and find mistakes! Thanks!

How can I produce GitHub annotations by creating report files on disk?

I am trying to find a portable way to produce code annotations for GitHub that avoids vendor lock-in.
Mainly, I want to dump annotations into a file (YAML, JSON, ...) during the build process and have a task at the end that transforms this file into GitHub annotations.
The main goal is to avoid hardcoding GitHub-annotation support into the tools that produce the findings, so that other CI/CD systems can also consume the annotation reports and display them in their UI.
linters -> annotations.report -> github-upload
Tools like flake8 are able to produce output in the parsable format file:line:column: message, but I need to know whether there is any attempt to standardize annotations so we can collect and combine them from multiple tools and feed them to the CI/CD engine.
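To illustrate, the kind of glue I'd rather not reinvent for every CI system is a small converter like the one below, which turns flake8's output into a JSON report; the field names are ones I made up, precisely because I don't know of an agreed standard:

    #!/usr/bin/env python3
    """Convert flake8-style 'file:line:column: message' lines into a neutral
    JSON annotations report. The field names are ad hoc, not a standard."""
    import json
    import re
    import sys

    LINE_RE = re.compile(r"^(?P<file>[^:]+):(?P<line>\d+):(?P<column>\d+):\s*(?P<message>.*)$")

    def parse(stream):
        for raw in stream:
            match = LINE_RE.match(raw.strip())
            if match:
                yield {
                    "file": match.group("file"),
                    "line": int(match.group("line")),
                    "column": int(match.group("column")),
                    "severity": "warning",
                    "message": match.group("message"),
                }

    if __name__ == "__main__":
        # Usage: flake8 . | python flake8_to_annotations.py > annotations.json
        json.dump(list(parse(sys.stdin)), sys.stdout, indent=2)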
Today I googled what the heck those "GitHub Action Annotations" are all about, and this was among the hits:
https://github.com/marketplace/actions/annotations-action
GitHub action for creating annotations from JSON file
As of now that page also contains:
This repository uses npm packages from #attest scope on github; we are working hard to open source these packages.
Annotations Action is not certified by GitHub. It is provided by a third-party and is governed by separate terms of service, privacy policy, and support documentation.
I didn't try it; again, it was just a random Google hit.
I am currently using https://github.com/yuzutech/annotations-action
Sample action code:
- name: Annotate
  uses: yuzutech/annotations-action@v0.3.0
  with:
    repo-token: ${{secrets.GITHUB_TOKEN}}
    input: ./annotations.json
    title: 'Findings'
    ignore-missing-file: true
It does its job well, with one minor defect. If you have findings on a commit/PR, you get to see each finding with a beautiful annotation right where you need it. If you re-push changes, the annotation is no longer displayed on later commits, even if the finding persists. I have opened an issue but have not yet received an answer.
The annotations-action mentioned above has not been updated and did not work for me at all (deprecated calls).
I haven't found anything else that works exactly the way I want.
Update: I found that you can use reviewdog to annotate based on findings. I also created a GitHub action for static code analysis here: https://github.com/tsigouris007/action-semgrep-reviewdog. You can visit the entrypoint.sh file and check how I piped the custom output to reviewdog using jq.
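The general shape of that pipe is roughly the following; I'm quoting the semgrep JSON fields and the reviewdog flags from memory, so check entrypoint.sh in the repository above for the exact version:

    # Turn semgrep's JSON findings into "file:line:col: message" lines and let
    # reviewdog post them as annotations (field names and flags are approximate).
    semgrep --config auto --json . \
      | jq -r '.results[] | "\(.path):\(.start.line):\(.start.col): \(.extra.message)"' \
      | reviewdog -efm="%f:%l:%c: %m" -name="semgrep" -reporter=github-pr-review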

Wagtail alongside Django REST Framework and drf-yasg?

I am implementing a Wagtail-powered blog within a larger (primarily DRF-driven) app. I'm attempting to use drf-yasg for my documentation.
Since installing Wagtail, the docs are throwing
'Request' object has no attribute 'wagtailapi_router'
It looks to be related to the introspection that drf-yasg does, and everything I can find about excluding views from drf-yasg is done at the code level. Since Wagtail is an installed module, I obviously want to avoid modifying it.
Has anyone got these 2 (3) components playing nicely together?
It's been a very long time since you asked this question, but as I found it while looking for an answer myself, I thought I'd share what worked for me.
Note that I'm not using drf-yasg, but rather DRF's own schema generator. They do, however, have a lot in common.
The problem in my case was that the schema generator URL was defined like this:
path(
    "schema/",
    get_schema_view(title="My API Schema"),
    name="openapi-schema",
),
What I needed to add was a patterns= argument that referenced my API specifically, leaving out the other, non-API URLs (like Wagtail's):
path(
    "v3/schema/",
    get_schema_view(title="My API Schema", patterns=router.urls),
    name="openapi-schema",
),
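If you do stay on drf-yasg, its get_schema_view() also accepts a patterns= argument, so the equivalent fix should look something like the sketch below. I haven't run this myself, and myproject.api is just a placeholder for wherever your DRF router lives:

    # urls.py - sketch: limit drf-yasg introspection to the DRF router's URLs so
    # Wagtail's routes (and its wagtailapi_router) are never inspected.
    from django.urls import path
    from drf_yasg import openapi
    from drf_yasg.views import get_schema_view
    from rest_framework import permissions

    from myproject.api import router  # placeholder: wherever your DRF router is defined

    schema_view = get_schema_view(
        openapi.Info(title="My API", default_version="v1"),
        public=True,
        permission_classes=(permissions.AllowAny,),
        patterns=router.urls,
    )

    urlpatterns = [
        path("v3/schema/", schema_view.with_ui("swagger", cache_timeout=0), name="openapi-schema"),
    ]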
I hope that helps... someone :-D

Using "check" package causes another package to error

I'm using the check package to validate parameters passed to Meteor methods, and I'm using audit-argument-checks to enforce this.
However, I've added another package, Meteor Tags, and when I try to use methods from the Tags package, I get a server error: "Exception while invoking method '/patterns/addTag' Error: Did not check() all arguments during call to '/patterns/addTag'".
I think I understand why this error happens - the method in the Tags package doesn't check its inputs, so audit-argument-checks generates an error. But I can't find any way around this, apart from 1) not enforcing checking, or 2) hacking the Tags package methods so they use check. Neither of these seems like a great option - checking server parameters is a good idea, and hacking a package is not very maintainable.
Does anybody know of a smart way to use audit-argument-checks with packages that provide new server methods? I have looked at the check documentation and searched online, but I haven't found an answer.
I hope this question makes sense.
Using audit-argument-checks is like saying: "I want to be serious about the security of the methods in my app." It's global to all methods in your app's codebase, including the methods from your installed packages.
There is no way to specify which parts of the app get checked, as that would be the equivalent of saying: "I want to be serious about the security of the methods I've written, but I don't care about the security holes created by some packages" (which doesn't make a lot of sense).
Note to package authors
Check your method arguments. It's not hard, and it prevents this situation from happening. Frankly, a package without this basic security really shouldn't be installed in the first place.
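For illustration, this is all a method needs in order to satisfy audit-argument-checks; the method name and arguments below are made up, not taken from the Tags package:

    // Minimal Meteor method that checks every argument it receives.
    import { Meteor } from 'meteor/meteor';
    import { check } from 'meteor/check';

    Meteor.methods({
      'patterns.addTag'(patternId, tagName) {
        check(patternId, String);
        check(tagName, String);
        // ...do the actual work here...
      },
    });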
What you should do
Unless you have a throwaway app, I wouldn't recommend removing audit-argument-checks. Instead I'd do the following (assuming the package really has something of value):
Open an issue on GitHub and let the maintainer know what's up.
Fork the code, and add the required checks. Keep this version as a local package.
Submit a pull request for the changes.
If all goes well, your PR will be accepted and everyone can benefit from the change. In the worst case, you'll still have a local copy that you can use in your app.

How can I avoid the mirror effect of images in Cover Flow?

I have downloaded a Cover Flow sample from this link:
http://www.macresearch.org/cocoa-tutorial-image-kit-cover-flow-and-quicklook-doing-things-we-shouldnt-are-too-fun-resist
I need the cover flow effect, but I don't need the mirror effect on the images. Is it possible to do this? Is there any API available in the IKImageFlowView class?
Any help would be appreciated.
There is no API available for IKImageFlowView. It is a private class. That's why the blog post is titled "doing things we shouldn't." If you look at the project in the blog post, you will find a reverse-engineered IKImageFlowView.h. That's as much information as is available. You can use class-dump as noted in the blog post and see if you can find the IKImageFlowViewDelegate protocol if there is one (this class appears to take a delegate). That might allow you to configure it.
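For example, dumping the private headers looks roughly like this (the framework path varies between macOS versions, so treat it as a starting point):

    # Generate headers for ImageKit's private classes (adjust the path for your system).
    class-dump -H -o ImageKitHeaders \
      /System/Library/Frameworks/Quartz.framework/Frameworks/ImageKit.framework
    # Then look through ImageKitHeaders/ for IKImageFlowView and a possible delegate protocol.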
Note that Apple may change this class at any time.
You are probably better off using a third-party implementation like MBCoverFlowView or OpenFlow.
