How to automate refreshing unread count - Lotus

A user's workspace contains several chiclets that can refer to retired servers, and opening one throws the error "You are not authorized" even if the user has been sent a link to a new replica of that application. To fix this, the user can right-click on any open space of the workspace and choose the 'Refresh Unread Count' option, which removes any chiclets referring to retired servers. I was able to get a handle on the workspace file, desktop8.ndk, but the call to 'GetAllUnreadDocuments' does nothing. I also looked in the Lotus C API 8.5 Reference and found a few entries I wasn't sure would solve my problem, and I checked the 6/7/8/8.5 forums.
So, the question is: can a routine be created that automates the "Refresh Unread Count" process, so that it can be called from a button or during the PostOpen of the inbox?
I think the company Panagenda has a process that removes old servers from the workspace, but I'm not sure how they go about it. I did see that 'cache.ndk' has a field, $SourceDbPath, that stores the server/path of databases. Maybe parsing that and making a call to the server to open 'names.nsf' is the way to go. However, 'cache.ndk' doesn't contain all applications on the user's workspace.
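If the $SourceDbPath route pans out, the parsing step itself is simple. A minimal Python sketch, assuming the field stores values in the usual Notes 'Server!!Path' network-path form (the exact format inside cache.ndk is an assumption here):

```python
def parse_source_db_path(value):
    """Split a Notes-style 'Server!!path' string into its parts.

    Values like 'CN=Acme01/O=Acme!!mail\\jdoe.nsf' are assumed;
    a value with no '!!' is treated as a local database path.
    """
    if "!!" in value:
        server, path = value.split("!!", 1)
        return server, path
    return None, value  # local database, no server component


def group_by_server(values):
    """Group database paths by server so each server only needs to be
    probed for reachability once."""
    by_server = {}
    for v in values:
        server, path = parse_source_db_path(v)
        by_server.setdefault(server, []).append(path)
    return by_server
```

Each server key could then be probed once (e.g. by attempting to open its names.nsf), and the chiclets pointing at unreachable servers flagged for removal.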
Thanks,
Dwain

As far as I know, the workspace does not use ordinary Notes document structures and there are no published APIs specifically for accessing it. Panagenda has done extensive research to reverse-engineer the data structures and has written their own low-level code to manipulate them.

Related

Is there a way to hand a document around with Power Automate?

I have a workflow that requires me to hand a file around my team, and each of my team members needs to do something with this document. They have to do it in a certain order, one after another.
The current solution is that I send an email to the first person with this file and wait until I receive the document back. Then I send the received document to the next person and so on...
I already looked at all the connectors; in particular, the 'Email with options' action from the Outlook connector and the Approvals connector look promising.
Getting the file into the workflow and attaching it to an email is easy, but I have been stuck for quite a few hours now on how to get the received file back into the workflow. I should add that, in the ideal case, the file goes directly back into the workflow without taking a detour through my mailbox.
There are a bunch of commercial solutions out there, e.g. Adobe Sign, but I would really like to solve this without having to upload my files to some other service and rely on another company (other than Microsoft, obviously).
I would really appreciate any suggestions on how one could solve this task!
Thanks a lot.
Short Answer
You need shared storage that all members of the process can access; the file should then be opened and updated from there.
My recommendation, if your company's Teams/O365 groups are set up well, is to use a specific folder in your team's SharePoint site (O365 group) that will be accessible via Teams, a browser, or any of the applications required.
This can then be done in the approval flow you're playing with, or via one or several approval flows within the context of a BPF.
Those methods:
Approval Flow
Business Process Flow (BPF)
Detail
Shared Storage
This won't be hard to sort out. If the people involved are only a few in a larger team and the data is sensitive, create a separate folder and restrict access. Otherwise, you should at least restrict write access, to ensure that only the people involved can modify the file.
As mentioned earlier, the only thing that could hold you back is the company's setup with regard to O365 groups, Azure (and on-premises) AD groups, and the teams themselves. But it really shouldn't be an issue for this.
If the group infrastructure is bad, that's still fine: you can lean into it and create a brand new team in Teams. Once you've done that, find the new O365 group it creates, then manage it all from SharePoint (you can even add a tab in the Teams client to manage the process!) to ensure that the permissions are just right.
Approval Flow
Build the logic first. It should be relatively simple:
Person A performs their task, they click to say it's done.
Person B. Etc.
Then you can start worrying about the file, and how it's accessed and from where.
This is by far the easiest way to do things, and it allows you to keep everything as simple as possible. For the logic, just plot it out step by step; once you have that, look at where you can economise, either by looping elements or by using variables so the flow no longer requires the specifics you began with.
With any luck, you'll soon have it doing most of the work for you. You can even ensure that copies of the file are made at each stage and are then archived, if you like.
Business Process Flow
This is my preferred option because it codifies the process, and you can make the flow(s) themselves as complicated as you like, separately.
The BPF will clearly show the organisation how your team performs the task, i.e. Johnny edits, then Billy edits, then Jenna edits. However, at each stage (or for bespoke tasks) you can call on different flows to perform whatever tasks you need performed.
There are positives and negatives to this approach, mainly:
Positive - You can set it up without ANY automation, and you can use it to manage your current manual process.
Positive - Later, you can start to introduce the automations you need to process what is required.
Negative - This is advanced stuff, and it's not only difficult to learn, but it's difficult to get right. That said, the end result will be amazing.
I want to share my final solution, based on Eliot Cole's answer and lots of internet research.
Basically, I automated my mailbox: I use the Outlook connector to send and receive mails and handle the attachments between them.
The flow is triggered manually; the user has to enter the email addresses of all the recipients and select the file to pass around. I then store the recipients in an array so I can loop over them later. Additionally, a unique ID is generated to identify the emails belonging to this flow later on.
Next, there is a loop over all recipients. The file is sent to the first recipient in the array, and another loop waits for the recipient to reply to the message before continuing with the next one.
Finally, a closer look at the "receive loop". This runs until an email with an attachment arrives from the recipient. All emails matching the ID generated earlier are retrieved, and if one has an attachment, that attachment is stored in the file variable. If no email matches the criteria, the flow waits for some time and then checks the mailbox again.
At the very end, I send an email back to myself with the last received file, as the workflow is finished at that point.
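Outside of Power Automate, the receive loop described above boils down to a generic poll-until-found pattern. A Python sketch of that pattern, with the mailbox check injected as a callable (`fetch_reply` and the flow-ID tagging are hypothetical stand-ins for the Outlook connector steps):

```python
import time

def wait_for_attachment(fetch_reply, flow_id, poll_seconds=60, max_polls=1440):
    """Poll until fetch_reply(flow_id) returns an attachment, or give up.

    fetch_reply is expected to return the attachment bytes once a reply
    tagged with flow_id and carrying an attachment has arrived, else None.
    """
    for _ in range(max_polls):
        attachment = fetch_reply(flow_id)
        if attachment is not None:
            return attachment
        time.sleep(poll_seconds)
    raise TimeoutError(f"no reply with attachment for flow {flow_id}")
```

Driving the whole hand-around is then a loop over the recipients: send the file, call `wait_for_attachment`, and pass the result into the next send.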

Aliases removed across multiple classes

We create classes for multiple school districts via Google's Classroom API. We noticed that on 2017-12-18 multiple classes had their aliases removed (we ended up creating duplicate classes, as we use this alias as our unique ID). We use a domain-scoped alias as defined here: https://developers.google.com/classroom/reference/rest/v1/courses.aliases
Any ideas? I'll keep this updated as we find more information.
Google found the root cause on their end. This was a Google issue.
Note: I did remove some parts of the messages from Google, but nothing context-related, just some specific things about questions related to our company.
Hello *****,
I hope this message finds you well. This is a follow-up message concerning your case with Google Cloud Support.

I would like to inform you that we just received an update from our Engineering Team regarding the bug that was affecting the Classroom aliases. According to the update, our Engineering Team was able to identify the root cause of the issue, and the issue is now fully resolved. This means that aliases will no longer be cleared when making updates to courses via the API. If possible, we would like to know if you are still seeing aliases that are deleted or seemingly disappearing.

Due to the nature of this bug, our Engineering Team will not be able to restore the previously deleted aliases, but they are working on an alternative solution in case anyone else encounters this issue in the future. This means they will have a way to recover aliases in the future should they be deleted by a bug or issue on our end. Please rest assured that this API issue should not occur again, but if it does, we will have a way to get those aliases restored.

I suggest that you recreate the aliases, test to see if there are any other issues, and let me know if you need any additional help or have any questions regarding the above response. Again, we really appreciate your patience and continued support while we worked to identify the root cause of this issue.

The best way to reach our API team is by opening a support case and waiting for us to reply via email or call you back. We will normally send an email requesting the best time and phone number to contact you. You can also reply to any of your cases within a period of 30 days; the case will be reopened and I will contact you back as soon as possible.
Sincerely,
Google Cloud Support

Any way to use MvcMiniProfiler on windows application? Or is there a sister application?

So I've started using MvcMiniProfiler on our websites and quite like it. We have a Windows application component/framework that is leveraged by the website, and I was wondering if it is possible to use the profiler on that. I'm assuming not, but maybe there is a subcomponent of the code that could be used? I see there is a way to configure where the results are stored (i.e. SQL Server), so maybe it is close to possible?
We have the following flow:
Website submits job to 'broker' then returns a 'come back later' page.
Broker runs and eventually data in the websites database gets updated by the broker.
Website displays the results.
It'd be great if there was a way I could get the entire workflow profiled. If there is no way, and no intention from the developers to make MvcMiniProfiler available to Windows applications, are there any recommendations for similarly styled profilers?
You could get this working by using SqlServerStorage; there is very little in the code base that heavily depends on ASP.NET. In fact, the SQL interceptor is generalized, and so is the stack used to capture the traces.
I imagine a few changes would need to be made internally, e.g. using Thread.SetData as opposed to HttpContext, but they are pretty superficial.
The way you would get this going is by passing the "profiling identity" into the app and then continuing tracking there. Eventually, when the user hits the site afterwards, the results would show up as little "chiclets" on the left side.
A patch is totally welcome here, but it is not something the profiler does in its current version.
(note to future readers, this is probably going to be out of date at some point, if it is please suggest an edit)
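The "profiling identity" hand-off described above is essentially a correlation ID travelling with the job. A language-agnostic sketch of the idea in Python (the class and field names are invented for illustration; this is not MiniProfiler's API):

```python
import uuid

class ProfilingSession:
    """Collects timing steps under one correlation id, so traces from the
    website and the broker can be stitched together later in storage."""

    def __init__(self, correlation_id=None):
        self.correlation_id = correlation_id or str(uuid.uuid4())
        self.steps = []

    def step(self, name, duration_ms):
        self.steps.append((name, duration_ms))


# Website side: create a session and submit its id along with the job.
web = ProfilingSession()
web.step("render come-back-later page", 12)
job = {"payload": "...", "profiling_id": web.correlation_id}

# Broker side: resume profiling under the same id it received.
broker = ProfilingSession(job["profiling_id"])
broker.step("process job", 850)

# Both trace fragments share one id, so a viewer reading the shared
# storage (e.g. SqlServerStorage) can merge them into one timeline.
assert web.correlation_id == broker.correlation_id
```

The design point is simply that the ID, not the HttpContext, is the thread that ties the fragments together, which is why the HttpContext dependency has to go.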
Yes, there's a Windows port of MiniProfiler: http://nootn.github.io/MiniProfiler.Windows/

Calendar integration to Domino (Lotus Notes)?

How do I integrate with a Lotus Notes Domino server? I know there are several versions and the answer would be different for each, but advice on any version would be great at the moment, as I haven't yet been told which server I'm supposed to integrate with. Assume version 6+.
I'm assuming I need to do the integration with the server and not the local Lotus Notes client, but that might not be correct?
I need to both read and write to the calendar appointments of a select number of users.
For instance, I should be able to create/update/delete an appointment for a certain user.
The appointments are the only thing I need access to, at the moment I have no need for the mails.
From what I have read on the internet, there is no standard interface to do this?
Should I develop a Domino app that does what I want?
Maybe there is a server API that I can use to connect and retrieve information?
Hopefully this can be done in C#? If not, what is the preferred way? I read something about Java, and that is doable too.
If you don't have any concrete answers but you have useful links, please post those as comments.
I have used the Java and C++ APIs to read a Domino calendar. Depending on the scenario, a server-side solution can run into trouble if you want to do more than read; the workflow sometimes needs the Notes client. I'd need to understand more about what you intend to do.
API documentation:
http://www.ibm.com/developerworks/lotus/downloads/toolkits.html
I'd use Java.
Here's Domino Designer help section on Java:
http://publib.boulder.ibm.com/infocenter/domhelp/v8r0/topic/com.ibm.designer.domino.main.doc/H_9_CODING_GUIDELINES_JAVA.html?resultof=%22%6a%61%76%61%22%20
First, read the 'Running a Java program' section.
Then you'll be interested in the 'Accessing databases' link.
Here's an example of how to access a user's mail db (calendar items are stored inside the mail db in Lotus):
http://publib.boulder.ibm.com/infocenter/domhelp/v8r0/topic/com.ibm.designer.domino.main.doc/H_EXAMPLES_OPENMAIL_METHOD_JAVA.html
GooCalSync (on OpenNTF) and LotusNotes-Google Calendar Synchronizer (on SourceForge) are great examples of how to do this in Java.
The best way to do this without the pain of having to write code is to use iCal. You will run into all sorts of issues with access, reading appointments, etc. that are best left to Domino to handle.
There are some good documents on the web on iCal support in Domino.
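To give a sense of what the iCal route involves, here is a minimal Python sketch that builds an iCalendar meeting request by hand. In a real integration you would hand this to Domino (e.g. via mail routing) or use a proper iCalendar library; the organizer address below is made up:

```python
from datetime import datetime

def make_vevent(uid, start, end, summary, organizer_email):
    """Build a minimal iCalendar meeting request as a string.

    Times are assumed to be UTC; RFC 5545 requires CRLF line endings.
    """
    fmt = "%Y%m%dT%H%M%SZ"
    lines = [
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//calendar-sketch//EN",
        "METHOD:REQUEST",
        "BEGIN:VEVENT",
        f"UID:{uid}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        f"SUMMARY:{summary}",
        f"ORGANIZER:mailto:{organizer_email}",
        "END:VEVENT",
        "END:VCALENDAR",
    ]
    return "\r\n".join(lines) + "\r\n"
```

The appeal of this approach is exactly what the answer says: you describe the appointment in a standard format and let Domino do the access control and calendar bookkeeping.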
I've done this before for a CRM product (clearc2.com). iCal is easy, but if you want to do more than insert items and actually do a bi-directional sync to the calendars (which are mail databases on a Domino server), then I would look at the appendix of the Lotus Notes C API first. There is a section that explains the C&S (calendaring and scheduling) piece fairly well. You do not need to use the C API to do the work, but it will explain what the many C&S items (fields) are for.
My advice is to keep it simple, e.g. do not try to tackle repeating items (appointments/tasks) on the first attempt. And try not to re-use any custom product objects you find in the mail template; these are undocumented Notes classes and can go away at any time. Furthermore, they may not work the same between point releases, or even incremental releases. The mail template code can be evil.

How would you make an RSS feed's entries available longer than they're accessible from the source?

My computer at home is set up to automatically download some stuff from RSS feeds (mostly torrents and podcasts). However, I don't always keep this computer on. The sites I subscribe to have a relatively large throughput, so when I turn the computer back on, it has no idea what it missed between the time it was turned off and the latest update.
How would you go about storing a feed's entries for a longer period of time than they're available on the actual sites?
I've checked out Yahoo Pipes and found no such functionality. Google Reader can sort of do it, but it requires manually marking each item. MagpieRSS for PHP can do caching, but that's only to avoid retrieving the feed too often, not really storing more entries.
I have access to a web server (LAMP) that's on 24/7, so a solution using PHP/MySQL would be excellent; any existing web service would be great too.
I could write my own code to do this, but I'm sure someone must have encountered this issue before?
What I did:
I wasn't aware you could share an entire tag using Google Reader; thanks to Mike Wills for pointing this out.
Once I knew I could do this, it was simply a matter of adding the feed to a separate Google account (so as not to clog up my personal reading list). I also did some selective matching using Yahoo Pipes, just to get the specific entries I was interested in and to minimize the risk that anything would be missed.
It sounds like Google Reader does everything you want. I'm not sure what you mean by marking individual items; you'd have to do that with any RSS aggregator.
I use Google Reader for my podiobooks.com subscriptions. I add all of the feeds to a tag, in this case podiobooks.com, that I share (but don't share the URL). I then add the RSS feed to iTunes. Example here.
Sounds like you want some sort of service that checks the RSS feed every X minutes, so you can download every single article/item published to the feed while you are "watching" it, rather than only seeing the items displayed on the feed when you go to view it. Do I have that correct?
Instead of coming up with a full-blown software solution, can you just use cron or some other sort of job scheduling on the webserver with whatever solution you are already using to read the feeds and download their content?
Otherwise it sounds like you'll end up coming close to re-writing a full-blown service like Google Reader.
Writing an aggregator to keep a longer history shouldn't be too hard with a good RSS library.
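The core of such an aggregator is just "poll the feed on a schedule and keep every entry you haven't seen before". A minimal Python sketch using SQLite for the archive, deduplicating on each entry's GUID (the entry dicts stand in for whatever your RSS library parses out of the feed):

```python
import sqlite3

def open_archive(path=":memory:"):
    """Create the archive table if needed and return the connection."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS entries (
               guid TEXT PRIMARY KEY,
               title TEXT,
               link TEXT,
               fetched_at TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn

def archive_entries(conn, entries):
    """Insert new entries, skipping GUIDs already stored; return count added."""
    added = 0
    for e in entries:
        cur = conn.execute(
            "INSERT OR IGNORE INTO entries (guid, title, link) VALUES (?, ?, ?)",
            (e["guid"], e["title"], e["link"]),
        )
        added += cur.rowcount  # 0 when the GUID was already archived
    conn.commit()
    return added
```

Run from cron every few minutes on the always-on LAMP box, entries stay in the table long after they scroll off the live feed, which is exactly the history the question asks for.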
