Zope: register external methods using a ZCML configuration

Is it possible to register external methods for Zope using a configure.zcml file or something similar? I'm trying to register external Python scripts (similar to other registry items such as "jsregistry.xml" or "cssregistry.xml" in themes).

No. External Methods are "old tech", pre-dating the Zope Component Architecture by several years.
You can easily add a GenericSetup import step that creates ExternalMethod objects on demand, but since External Methods can only live in Python modules located in Extensions directories (inside Products and the INSTANCE_HOME location), you may as well just enumerate those locations via the usual Python file-access methods, add everything you find there, and not use a registry at all.
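For illustration, a minimal sketch of such an import step (the product, module, and method names here are hypothetical):

from Products.ExternalMethod.ExternalMethod import ExternalMethod

def install_external_methods(context):
    # GenericSetup import step; wire it up via import_steps.xml
    site = context.getSite()
    if 'my_method' not in site.objectIds():
        em = ExternalMethod('my_method', 'My method',
                            'MyProduct.mymodule', 'my_function')
        site._setObject('my_method', em)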
However, are you absolutely certain you want to use an ExternalMethod? Wouldn't a simple utility or view be easier?

Related

How to include new IDL files and compile them in the REDHAWK SDR source code

I want to place the files a.idl and b.idl in the folder at the link https://github.com/RedhawkSDR/framework-core/tree/master/src/idl/ossie/CF
I also included a.idl and b.idl in the Makefile at this link:
https://github.com/RedhawkSDR/framework-core/tree/master/src/idl
as is done for all the other IDL files mentioned there.
But these do not get compiled; I am not able to find the generated code anywhere.
Please provide any inputs.
In addition to including a.idl and b.idl in the file "Makefile.am" at this link https://github.com/RedhawkSDR/framework-core/tree/master/src/idl , we also have to do the following in the "Makefile.am" present at the link https://github.com/RedhawkSDR/framework-core/tree/master/src/base/framework/idl :
Add aSK.cpp, aDynSK.cpp, bSK.cpp, bDynSK.cpp to the "BUILT_SOURCES" variable defined in the file.
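For illustration, the change might look roughly like this (the existing entries in the variable are elided):

BUILT_SOURCES = \
        ... \
        aSK.cpp aDynSK.cpp \
        bSK.cpp bDynSK.cpp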
With this done, the skeleton and stub code can now be seen in the folders at the following links:
https://github.com/RedhawkSDR/framework-core/tree/master/src/base/framework/idl
and
the folder "RedhawkSDR/framework-core/tree/master/src/base/include/ossie/CF/", which is generated on running the install command.
REDHAWK's IDL is split into two main categories: core services and ports. Core services are related to REDHAWK's core functionality, like deploying an application. Ports are application-specific interfaces for communicating between different processing stages (components or devices). Core services are not meant to be extended, while ports are meant to be extended by the user beyond those already provided (see https://redhawksdr.github.io/2.2.4/manual/connections/)
New IDL can be added to a REDHAWK instance by creating custom IDL interfaces (https://redhawksdr.github.io/2.2.4/manual/connections/custom-idl-interfaces/)

Google Cloud Logs Export Names

Is there a way to configure the names of the files exported from Logging?
Currently the exported files include colons. These are invalid characters as path elements in Hadoop, so PySpark, for instance, cannot read these files. Obviously the easy solution is to rename the files, but this interferes with syncing.
Is there a way to configure the names, or change them to not include colons? Any other solutions are appreciated. Thanks!
https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/site/markdown/filesystem/introduction.md
At this time, there is no way to change the naming convention when exporting log files, as this process is automated on the backend.
If you would like to request this feature in GCP, I would suggest filing an entry in the Public Issue Tracker (PIT). That page allows you to report bugs and request new features to be implemented within GCP.
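If you do fall back to renaming the exported files (which, as you note, can interfere with syncing), a minimal sketch using the google-cloud-storage Python client might look like this (the bucket name is hypothetical):

from google.cloud import storage

client = storage.Client()
bucket = client.bucket('my-log-export-bucket')  # hypothetical bucket
for blob in client.list_blobs('my-log-export-bucket'):
    if ':' in blob.name:
        # rename_blob copies to the new name and deletes the original
        bucket.rename_blob(blob, blob.name.replace(':', '_'))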

Is it possible to configure properties like jcr:primaryType from a Maven install

I'm following the steps from the Adobe instructions on How to Build AEM Projects using Maven and I'm not seeing how to populate or configure the meta data for the contents.
I can edit and configure the actual files, but when I push the zip file to the CQ instance, the installed component has a jcr:primaryType of nt:folder, while the item I'm trying to duplicate has a jcr:primaryType of cq:Component (as well as many other properties). So is there a way to populate that data without needing to manually interact with CQ?
I'm very new to AEM, so it's entirely possible I've overlooked something very simple.
Yes, it is possible to configure JCR node types without manual changes in CQ.
Make sure you have a .content.xml file in the component folder and that it contains the correct jcr:primaryType (e.g. jcr:primaryType="cq:Component").
This file contains the metadata for mapping file-system content onto JCR nodes.
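For illustration, a minimal .content.xml for a component might look like this (the title and group values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<jcr:root xmlns:cq="http://www.day.com/jcr/cq/1.0"
          xmlns:jcr="http://www.jcp.org/jcr/1.0"
    jcr:primaryType="cq:Component"
    jcr:title="My Component"
    componentGroup="My Group"/>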
For beginners it may be useful to take a look at VLT, which is used to import/export JCR content to and from the file system. The component's files in your project should be similar to the result of a VLT export of that component from the JCR.

Is there a way to import/export tasks from different CQ instances?

I have two instances of CQ and between them I want to be able to import/export tasks.
For example:
On instance 1 I can see all tasks by going to http://instance1/libs/cq/taskmanagement/content/taskmanager.html#/tasks/Delta
On instance 2 I can see all tasks by going to http://instance2/libs/cq/taskmanagement/content/taskmanager.html#/tasks/Delta
There might be some scenarios where I want to take all tasks from instance2 and add them as additional tasks into instance1 (on top of the tasks it may already have).
Is this possible to do?
Yes, you can do this with Package Manager. The tasks are stored as nodes in the JCR repository, so you can create a package that filters the task nodes you want to migrate from one instance to another. For example, you could define a package with this filter definition to include all tasks:
/etc/taskmanagement/tasks
If you don't want all tasks, you may need to define the filter(s) more narrowly to pick only the ones you want to include.
For example:
/etc/taskmanagement/tasks/2015-05-04/Delta/TheTaskYouWantToMigrate
Use the browser when defining the filter to find the tasks you want to include.
See Working with Packages for details on using the Package Manager. This tutorial also shows how to create the package and add filters. Once you've created a package with the filters for the tasks you want to include, build the package and download it. On your other instance, upload the package you built and install it. You will then see the tasks from your first instance replicated onto the second instance.
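For reference, the equivalent entry in the package's META-INF/vault/filter.xml for the broad filter above would look something like:

<workspaceFilter version="1.0">
    <filter root="/etc/taskmanagement/tasks"/>
</workspaceFilter>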
In addition to what Shawn said, you can also use the replication mechanism to do the work for you and replicate the desired nodes between any two instances.

Creating dtrace probes for plugins using single provider name

Note that this is for Mac OS X, although I imagine my problem would exist on any dtrace-capable OS.
I have an app that utilizes a lot of plugins. I'm adding userland probes to it, in both the core application and in the plugins themselves. The issue is that if I use the same provider name in the plugins that the main app is using, those probes don't show up when I attempt to create a list of available probes. It appears that whichever code loads first wins.
my .d file in my main app:
provider MyApp {
    probe doSomething();
};
and in my plugin:
provider MyApp {
    probe plugin_doSomethingPluginish();
};
Changing the name of the provider to something else, like MyAppPlugin, works, but then the list of providers is going to get insane (MyAppPlugin1, MyAppPlugin2, etc). I'd like to think that there's a way to add in new plugin-defined probes under the same provider name as the main app, but I'm either not seeing it or it doesn't exist.
So is there a way to do this? And if not, is it normal to have a different provider for each plugin even though the module name is already unique? Seems like that's what the module name is for...
You should just define one provider .d file and import the generated .h file into each class that uses any of those probes; there is really no reason to have multiple .d files each listing the same provider. I just checked the DTrace documentation and don't see anything about this right off the bat, but I would presume that multiple .d files defining the same provider create some sort of conflict, or that loading a probe listing for an already-loaded provider redefines the listing rather than extending it as you probably intended.
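For example (a sketch; the file name is illustrative, and the macro names follow the standard dtrace -h generation), keep one shared provider definition in MyApp.d:

provider MyApp {
    probe doSomething();
    probe plugin_doSomethingPluginish();
};

generate the header once with "dtrace -h -s MyApp.d -o MyAppProbes.h", and fire the probes from the app and plugin sources alike:

#include "MyAppProbes.h"

/* guard with the _ENABLED() test to skip argument setup when the probe is off */
if (MYAPP_DOSOMETHING_ENABLED())
    MYAPP_DOSOMETHING();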
