Is there a way to avoid providing the implementation of external functions in Go? For example, in C you can compile the code into a static library and provide a header containing only the external API of your library for the user. In Go, however, even if I compile the code into a static library, the only way to use the library is by importing the package containing all the exported functions with their implementations.
So, the question is: in Go, is there a way to provide the user with only an API plus a binary file, like a static library, and let them use that API without giving them any implementation code?
Thanks
I am using buf to generate gRPC server and client code for several languages (Go, Python, JS, Java, C#), while using the grpc-ecosystem/plugins/openapiv2 plugin to generate Swagger documentation from the same proto files.
In some files I'm using a custom option such as
option (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_tag) = {description: "Manage datasets and examples used for training."};
to add additional metadata to the documentation. This requires me to import annotations.proto from the grpc-gateway project, which causes the import to also appear in the generated source files. Languages like Go can handle this with a blank import for side effects:
import (
_ "github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2/options"
_ "google.golang.org/genproto/googleapis/api/annotations"
)
but in Java and C# there are lines being generated that look like this:
registry.add(com.google.api.AnnotationsProto.http);
registry.add(grpc.gateway.protoc_gen_openapiv2.options.Annotations.openapiv2Tag);
which cause compilation errors, because the package grpc.gateway... does not exist (I was able to import the googleapis package via Maven and NuGet). When I remove the options from the .proto files there are no issues and I can compile the source files into a package for distribution. Is there any way to exclude these imports from the generated code?
I have tried separating the documentation into its own files, but that's impossible for metadata that is part of Service or Message definitions, as I get duplicate definition errors.
Since there is no official Java library that corresponds to that annotations.proto file, you need to generate your own or do some shenanigans to modify the compiled descriptors before generating the Java code. I'll explain both.
You can generate Java code for annotations.proto, not just your own proto that imports it.
If you use Buf, you can actually tell it to generate source code for your imports using a --include-imports flag to buf generate (though this generates sources for all imports, not just ones that don't otherwise have a corresponding Java library).
These annotations are available in Buf's Schema Registry, so you could also separately generate these files and compile them into a separate JAR using buf generate buf.build/grpc-ecosystem/grpc-gateway.
One issue with this file is that it does not declare a Java package option. That's why the Java package in the generated code doesn't have a proper reverse domain name. With Buf, you could use managed mode to inject a Java package option and generate these files into whatever package you want (so you could generate them as if they were in a "shaded" package in your own JAR).
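For illustration, here is a rough buf.gen.yaml sketch of that approach. It assumes Buf's v1 config format and a made-up prefix com.example.shaded; key names have shifted between Buf releases, so check the managed-mode docs for your version.

version: v1
managed:
  enabled: true
  # Assumption: inject this prefix so the generated annotation classes land in
  # a "shaded" package of your own rather than the bare default package.
  java_package_prefix: com.example.shaded
plugins:
  # Remote Java plugin from the Buf Schema Registry; a local protoc plugin works too.
  - plugin: buf.build/protocolbuffers/java
    out: gen/java

Running buf generate buf.build/grpc-ecosystem/grpc-gateway against a config like this would emit the annotation classes under your chosen prefix, ready to compile into their own JAR.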
This second route is much less advised and not for the faint of heart, but it lets you omit the import in the generated code. You first need to compile your sources to a file descriptor set. (Buf can produce this via buf build; with protoc, use the -o option instead of --java_out.) This file is a binary-encoded FileDescriptorSet.
You could write something that reads this file (unmarshalling its contents into a FileDescriptorSet) and then modifies it: examine the dependency field of every file in the set and remove entries like "protoc-gen-openapiv2/options/annotations.proto".
You can then re-marshal this to a file and feed that back in for code generation. So instead of generating Java code from sources, you'd generate it from the modified descriptor set (buf generate <file>#format=bin if using Buf, or protoc --descriptor_set_in=<file> --java_out=<output-dir> if using protoc).
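For concreteness, here is a minimal Go sketch of that rewrite step, using the google.golang.org/protobuf module. The file names image.binpb and stripped.binpb are placeholders, and the drop set lists only the import named above; add any other annotation-only imports you need to strip.

package main

import (
	"log"
	"os"

	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/types/descriptorpb"
)

func main() {
	// Read the FileDescriptorSet produced by `buf build -o image.binpb`
	// or `protoc -o image.binpb` (file names are placeholders).
	data, err := os.ReadFile("image.binpb")
	if err != nil {
		log.Fatal(err)
	}
	var set descriptorpb.FileDescriptorSet
	if err := proto.Unmarshal(data, &set); err != nil {
		log.Fatal(err)
	}

	// Imports that only contribute custom options and can safely be dropped.
	drop := map[string]bool{
		"protoc-gen-openapiv2/options/annotations.proto": true,
	}

	for _, file := range set.File {
		var kept []string
		for _, dep := range file.GetDependency() {
			if !drop[dep] {
				kept = append(kept, dep)
			}
		}
		file.Dependency = kept
		// Caveat: if public_dependency or weak_dependency indexes refer to a
		// removed entry, they would need re-indexing as well; omitted here.
	}

	// Write the modified set back out for code generation.
	out, err := proto.Marshal(&set)
	if err != nil {
		log.Fatal(err)
	}
	if err := os.WriteFile("stripped.binpb", out, 0o644); err != nil {
		log.Fatal(err)
	}
}

You would then point buf generate or protoc --descriptor_set_in at stripped.binpb as described above.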
Note that this approach can only work if the only things that use the import being removed are custom options. That's because custom options can safely be represented as unrecognized fields in the descriptor (and effectively ignored). If you remove an import that has type definitions that are referenced in the file, the compiler will not accept the modified file descriptor set.
If that last route looks like Greek, it's because it is quite advanced descriptor-fiddling. Realistically, I think the first approach is your best bet.
I'm trying to get my head around the workings of an xAPI package authored in Rise which has been supplied to me so I can build a test PoC app.
I can see the functions built into the index.html page, and that things like progress and quiz scores are generated, but where do I find the endpoint for an LRS within the package?
I have incorporated the package into a test app I built, but rather than generate and send statements myself I would like to use what comes as part of the package.
If I import and play the package in SCORM Cloud, I get generated statements returned.
The only thing I can see is an entry in the tincan.js file, this.recordStores = []. Other than that I'm unsure where to go next; any suggestions?
Generally this kind of package implements a set of guidelines that were released with the 0.9 version of the specification (at the time named the Tin Can API and later changed to xAPI). Those guidelines provide for a packaging and launch mechanism, which is what Rise has implemented. The launch mechanism indicates that the endpoint and authentication credentials will be passed on the query string to the launched content, where it can retrieve them. The TinCanJS library used by Rise implements functionality to digest the query string and set up the objects (the ones you find in this.recordStores) that communicate with the xAPI LRS identified in the query string parameters.
You have two primary options:
1. Get the query string parameters directly from the launch URL and process them yourself, potentially using the same global library objects (TinCan.LRS) already available, to get an LRS object that you can then interact with as you see fit.
2. Leverage the objects already created for you via the this.recordStores list, which is already prepared by the package itself.
There are pros/cons to both methods and they largely depend on your familiarity with JavaScript and how flexible you need/want to be.
I'm using the SwiftAutomation framework to drive a scriptable app that searches for lyrics and returns an AppleScript record. Everything was working correctly, until...
I mapped the AppleScript record to a custom Swift structure according to the SwiftAutomation documentation. The code in the xxxGlue.swift file looks correct, but the compiler complains about SwiftAutomation.SelfUnpacking, with several follow-on errors, when building the MacOSGlues framework.
public struct LFBLyricsInfoRecord: SwiftAutomation.SelfPacking, SwiftAutomation.SelfUnpacking { ... }
--> .../MacOSGlues/LyricsFBAGlue.swift:700:81: No type named 'SelfUnpacking' in module 'SwiftAutomation'
The SelfPacking protocol is defined as public in SwiftAutomation, and the SelfUnpacking protocol is defined right under it, but without the public keyword. Is that the cause of the compiler error, and if so, how do I fix it?
OK, I finally found a resolution. It seems you have to use different options for the aeglue utility when generating the glue file for the MacOSGlues framework and for the Swift file where you actually use your automation, such as in the test project. In my case, where my scriptable app is named LyricsFBA.app, these were:
aeglue -S LyricsFBA.app
for MacOSGlues (generates a LyricsFBAGlue.swift that references SwiftAutomation, but does not include the custom record structure definition), and
aeglue -D -s 'LyricsInfo:lyricsInfo=score:Int+title:String+artist:String+composer:String+link:String+lyrics:String' LyricsFBA.app
for the test program (generates a LyricsFBAGlue.swift that does not reference SwiftAutomation, but does include the custom record structure definition).
Our QtWebKit-based application was rejected by Apple after submission to the Mac App Store. One of the reasons for rejection is the use of non-public APIs. I managed to find six of them in the QtWebKit source code, but I was unable to find the rest anywhere, even after searching through the source code of our application and the entire source code of Qt.
The six non-public APIs I found in the QtWebKit source are:
CFHTTPCookieStorageSetCookieAcceptPolicy
CFURLCacheCopyResponseForRequest
CFURLResponseGetMIMEType
CFURLResponseCopySuggestedFilename
CFURLCacheSetMemoryCapacity
CFURLCacheSetDiskCapacity
Here is the full list of violations found by Apple:
The use of non-public APIs can lead to a poor user experience should
these APIs change in the future, and is therefore not permitted. The
following non-public APIs are included in your application:
NSAccessibilityCreateAXUIElementRef NSAccessibilityHandleFocusChanged NSAccessibilityUnregisterUniqueIdForUIElement NSAppKitPropertyCreator NSCarbonWindowPropertyTag NSMouseMovedNotification _NSDrawCarbonThemeBezel _NSDrawCarbonThemeListBox _NSPopUpCarbonMenu3 _NXShowKeyAndMain
from the framework: '/System/Library/Frameworks/AppKit.framework/Versions/C/AppKit'

AXTextMarkerCreate AXTextMarkerGetBytePtr AXTextMarkerGetLength AXTextMarkerGetTypeID AXTextMarkerRangeCopyEndMarker AXTextMarkerRangeCopyStartMarker AXTextMarkerRangeCreate AXTextMarkerRangeGetTypeID CTLineCreateWithUniCharProvider CoreDragGetCurrentDrag CoreDragSetImage
from the framework: '/System/Library/Frameworks/ApplicationServices.framework/Versions/A/ApplicationServices'

GetNativeWindowFromWindowRef TSMGetInputSourceProperty
from the framework: '/System/Library/Frameworks/Carbon.framework/Versions/A/Carbon'

CFReadStreamSignalEvent _CFAppVersionCheckLessThan _CFBundleSetDefaultLocalization _CFStringGetUserDefaultEncoding
from the framework: '/System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation'

CFHTTPCookieStorageCopyCookiesForURL CFHTTPCookieStorageDeleteCookie CFHTTPCookieStorageGetCookieAcceptPolicy CFHTTPCookieStorageSetCookieAcceptPolicy CFHTTPCookieStorageSetCookies CFURLCacheCopyResponseForRequest CFURLCacheSetDiskCapacity CFURLCacheSetMemoryCapacity CFURLRequestCreateMutableCopy CFURLResponseCopySuggestedFilename CFURLResponseGetExpectedContentLength CFURLResponseGetHTTPResponse CFURLResponseGetMIMEType CFURLResponseGetURL CFURLResponseSetExpectedContentLength CFURLResponseSetMIMEType _CFNetworkHTTPConnectionCacheGetLimit _CFNetworkHTTPConnectionCacheSetLimit _CFURLCacheCopyCacheDirectory _CFURLRequestCreateArchiveList _CFURLRequestCreateFromArchiveList _CFURLResponseCreateArchiveList _CFURLResponseCreateFromArchiveList _CFURLResponseGetSSLCertificateContext _LSGetCurrentApplicationASN _LSSetApplicationInformationItem _kLSDisplayNameKey kCFStreamPropertyCONNECTAdditionalHeaders kCFStreamPropertyCONNECTProxy kCFStreamPropertyCONNECTProxyHost kCFStreamPropertyCONNECTProxyPort kCFStreamPropertyCONNECTResponse kCFURLResponseExpectedContentLengthUnknown
from the framework: '/System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices'

NSPopAutoreleasePool NSPushAutoreleasePool
from the framework: '/System/Library/Frameworks/Foundation.framework/Versions/C/Foundation'

CARenderCGDestroy CARenderCGNew CARenderCGRender CARenderNotificationAddObserver CARenderNotificationRemoveObserver CARenderServerGetPort CARenderServerStart CARenderUpdateAddContext CARenderUpdateAddRect CARenderUpdateBegin CARenderUpdateFinish kCAContextPortNumber
from the framework: '/System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore'
If you have defined methods in your source code with the same names as
the above-mentioned APIs, we suggest altering your method names so
that they no longer collide with Apple's private APIs to avoid your
application being flagged in future submissions.
Additionally, one or more of the above-mentioned APIs may reside in a
library included with your application. If you do not have access to
the library's source, you may be able to search the compiled binary
using "strings" or "otool" command line tools. The "strings" tool can
output a list of the methods that the library calls and "otool -ov"
will output the Objective-C class structures and their defined
methods. These techniques can help you narrow down where the
problematic code resides.
I've finally tracked down the location where these so-called private APIs are called. They are called from inside WebKit. WebKit uses a WebKit system interface library which is supplied directly by Apple in the form of a compiled static library plus a header file. More specifically, they are the four files located under the path src\3rdparty\webkit\WebKitLibraries:
libWebKitSystemInterfaceLeopard.a
libWebKitSystemInterfaceLion.a
libWebKitSystemInterfaceMountainLion.a
libWebKitSystemInterfaceSnowLeopard.a
I always wondered: if it's really a private API, who else would know how to call it without any documentation? Now it turns out to be Apple itself. Since neither Nokia nor Digia has the source code to these libraries, there is probably nothing they can do about it.
Now isn't it ironic that any QtWebKit-based apps will be rejected by Apple due to private API access from libraries created by Apple itself?
Please correct me if I am wrong or missed anything. I really hope I am wrong.
As I'm writing a Firefox XUL extension, I find that I want to share some functionality (the business logic) across the whole extension. What would be the best place to store this?
Can I create some sort of library (JavaScript) file which always gets loaded first?
You most likely want to create a JavaScript code module. You can use Components.utils.import() to load it:
Components.utils.import("chrome://myaddon/content/utils.jsm");
And in utils.jsm you define which symbols should be imported by that statement, e.g.:
var EXPORTED_SYMBOLS = ["Utils"];
var Utils = {
};
The module will be loaded when it is first used and will stay in memory after that; there will be only a single module instance no matter how many places in your extension use it. Note that I used a chrome:// URL to load the module; this is supported starting with Firefox 4. The documentation recommends using resource:// URLs, which is cleaner because modules don't actually have anything to do with the user interface; still, using a chrome:// URL is often simpler.