OBIEE 12.2.1.2 - copy data from dashboard, paste into Excel: wrong order

When copy-pasting data from dashboards (pivot tables, no interactions), the order in Excel is sometimes wrong.
For example: on the dashboard I have marked A - B (1 2 4 5) of a pivot table from left to right and copied them.
Pasting them into Excel sometimes results in:
Marking C B A (from right to left) and pasting into Excel would sometimes result in:
Using the regular OBIEE export button always works fine.
Ideas and suggestions are much appreciated - this happens with all browsers.

That is so far from what you are supposed to do that it isn't even funny anymore. Copy and paste into Excel? Pray tell, why do you use an analytical system in the first place?
tl;dr: totally wrong usage of the tool.
Edit: By the way, in 11g you couldn't even copy the data that way, for security reasons, because it's an action that isn't logged anywhere.

Related

You cannot import data to this record because the record was updated in Microsoft Dynamics 365 after it was exported

I'm having a strange issue with exporting/updating/importing data in our on-premises Dynamics 365 (8.2). I was doing a bulk update of over 3000 records by exporting the records to an Excel workbook, updating the data in a specific column, then importing the workbook back into CRM. It worked for all of the records except 14 of them, which, according to the import log, failed because "You cannot import data to this record because the record was updated in Microsoft Dynamics 365 after it was exported." I looked at the Audit History of those 14 records and found that they had not been modified in any way for a good two months. Strangely, the modified date of the most recent Audit History entry for ALL 14 records is the exact same date/time.
We have a custom workflow that runs once every 24 hours on a schedule and automatically updates the Age field of our Contact records based on the value in the respective Birthday field. All 14 of these records have a birthday of November 3rd, but in different years. That means the last modification made to them was on 11/3/2019, via the workflow. However, I cannot understand why the system "thinks" this should prevent a data update/import.
I am happy to provide any additional information that I may have forgotten to mention here. Can anyone help me, please?
While I was not able to discover why the records would not update, I was able to resolve the issue. Before I share what I did to update the records, I will try and list as many things as I can remember that I tried that did not work:
I reworked the Advanced Find query I was using to export the records that needed updating so that it returned ONLY the records that had actual updates. Previously, I used a more forgiving query that returned about 30 or so records, even though I knew that only 14 of them had new data to import. I did so because the query was easier to construct, and it was no big deal to remove the "extra" records from the workbook before uploading it for import: I would write a VLOOKUP for the 30-something records and remove the rows for which the VLOOKUP didn't find a value in my dataset, leaving me with the 14 that did have new data. After getting the error a few times, I started to ensure that I only exported the 14 records that needed to be updated. However, I still got the error when trying to import.
I tried formatting the (Do Not Modify) Modified On column in the exported workbook to match the date format in the import window. On export of the records, Excel was formatting this column as m/d/yyyy h:mm while the import window with the details on each successful and failed import showed this column in mm/dd/yyyy hh:mm:ss format. I thought maybe if I matched the format in Excel to the import window format it might allow the records to import. It did not.
I tried using some Checksum verification tool to ensure that the value in the (Do Not Modify) Checksum column in the workbook wasn't being written incorrectly or in an invalid format. While the tool I used didn't actually give me much useful information, it did recognize that the values were checksum hashes, so I supposed that was helpful enough for my purposes.
I tried switching my browser from the new Edge browser (the one that uses Chromium) to just IE as suggested on the thread provided by Arun. However, it did not resolve the issue.
What ended up working in the end was Arun's suggestion to just make some arbitrary edit to all the records and export them again afterward. This was okay to do for just 14 records, but I'm still slightly vexed, as this wouldn't really be a feasible solution if it were, say, a thousand records that were not importing. There was no field that ALL 14 Contact records had in common that I could just bulk edit and then bulk edit back again. What I ended up doing was finding a text field on the Contact form that did not have a value in it for any of the records, putting something in that field, then going to each record in turn and removing the value (since I don't know of a way to "blank out" or clear a text field while bulk editing). Again, this was okay for such a small number of records, but if it were to happen on a larger number, I would have to come up with an easier way to bulk edit and then bulk "restore" the records. Thanks to Arun for the helpful insights, and for taking the time to answer. It is highly appreciated!
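For a much larger batch, one option (not from this thread, just a sketch) would be to script the "dummy edit" through the Dynamics SDK instead of doing it by hand. The C# below assumes an already-authenticated IOrganizationService and a hypothetical list of affected Contact GUIDs; it reads a harmless field and writes the same value back, so the record's rowversion/checksum is refreshed before you re-export, which is the same idea as the manual workaround.

// Minimal sketch, not from the thread: programmatically "touch" a batch of
// contacts so their rowversion changes, then re-export them for the import.
// Assumes an already-authenticated IOrganizationService (Dynamics 365 SDK)
// and a hypothetical list of affected Contact GUIDs.
using System;
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class TouchRecords
{
    public static void TouchContacts(IOrganizationService service, IEnumerable<Guid> contactIds)
    {
        foreach (var id in contactIds)
        {
            // Read the current value of a harmless text field (jobtitle here)...
            var current = service.Retrieve("contact", id, new ColumnSet("jobtitle"));

            // ...and write the same value straight back. The data is unchanged,
            // but the platform still records an update, so a fresh export will
            // carry a checksum/rowversion that matches the database again.
            var touch = new Entity("contact") { Id = id };
            touch["jobtitle"] = current.GetAttributeValue<string>("jobtitle");
            service.Update(touch);
        }
    }
}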
When you first do an import of an entity (Contacts, for example), you will see that the Excel file you exported contains 3 hidden columns: (Do Not Modify) Contact, (Do Not Modify) Row Checksum and (Do Not Modify) Modified On.
When you want to create new instances of the entity, just edit the records and clear the content of the 3 hidden columns.
This error will happen when the checksum or rowversion of the exported record differs from the record in the database.
Try making a dummy edit to the affected records and then export/reimport again.
I can think of two reasons: either the datetime format is confusing the system :( or it's the weird scenario explained in the community thread.
Apparently, amending the exported file and then saving it as a different file type alters the spreadsheet's parameters.
I therefore used Internet Explorer, since when downloading the file the system asks the user to save it in a different format; I added .xlsx at the end to save it in the required format. I amended the file and imported it back into CRM. It worked.
For me it turned out to be a different CRM time zone setting for the exporting and importing user. Unfortunately, this setting doesn't appear to be changeable by an administrator via the user interface.
The setting is available for each user under File->Options->Time Zone.

Interactive Grid - Automatic Row Processing (DML): character limitation in code editor

I have a requirement where 300 columns have to be processed. I am trying to achieve this using IG Automatic Row Processing (DML). When writing the code in the editor I get an error stating 'Value too long by 2015 characters'.
I suppose this is an Oracle Apex limitation. Can someone please share their views on this?
When writing the code in the editor ...
I'd say that your problem isn't related to the number of columns, but to a large query which can't fit into the "SQL Query" item of the Page Designer.
Which Apex version do you use? I can't tell for sure (as I don't know it), but my impression is that Apex up to version 4.2 had that item limited to VARCHAR2(4000), so if your query is larger than that, it won't fit (as in your case - the query you wrote is 2015 characters longer than the maximum size the item allows). In 5.x versions, you can put a much larger query into the item (as if it were modified to a CLOB).
Now, as you are using an Interactive Grid, and that appeared in the 5.x versions, huh ... maybe what I wrote above isn't entirely true. Unfortunately, you can't switch to a source that is a function returning a query (as in Classic Reports), where you could write a (stored) function and simply call it from Apex.
As you said that you used automatic row processing, did you put too much code somewhere in there?
On the other hand, I Googled a little bit, looking for a limit on the number of columns in the IG - I couldn't find anything official, but someone complained (here, on Stack Overflow) that they tried to create an IG with over 100 columns, and it didn't work.
So, yes - maybe you hit the limit, but I can't confirm it. Hopefully, someone who knows Apex better will be able to assist. Alternatively, consider asking the same question on OTN forums, as people who designed Apex answer questions there.

Is it possible to reverse a column transformation in Spotfire, and if not, what are the alternatives?

I've made the mistake of using the 'Calculate and Replace Column' feature to replace the wrong column, and realized it after the fact. The column I replaced corresponds to last names and is important. I would like to retrieve this column but maintain my other 15 or so data transformations. Ideally, I would like to remove just this one transformation, but I've come up empty so far. Here's what I've tried:
I tried adding the 'last name' column again from the same external source, using >Insert >Columns... I also tried renaming this column to avoid the data transformation. Unfortunately, this resulted in an entirely empty column, so either it did not successfully match to the table or it was affected by the transformation.
I checked the source information, and found exactly the 3-4 lines that I wish were not there. I thought it might be possible to edit this but haven't found a way. This seems like it would be the easiest.
Another idea I had was to replace the data table with the same source and repeat all of the transformations from the Replace Data Table dialog (excluding the bad one). This is my next plan of attack, but I figured I would come on here to see if there's an easier way first.
Thanks in advance!
Good news for you, #jeremyVollen!
It is possible to 'edit' your transformation, per Tibco article 44098.
Resolution: If there is more than one transformation on a data table and you need to edit any of those transformations, follow the steps below:
Go To Edit >> Data Table Properties.
Select the desired data table inside which the transformation has been added and click on Refresh Data > With Prompt.
A new window will pop up which will allow you to make the desired changes in each of the transformations.
Unfortunately, it is NOT possible to reverse data table transformations.
It IS possible to undo the transformations with Edit>>Undo or CTRL+Z, but that's as far as it goes.
My strategy for dealing with this is (in accordance with your #3) to visit Edit>>Data Table Properties, select the table I'm interested in, select Source Information, then copy the contents of the textarea and paste it into Notepad. Then I'll File>>Replace Data Table and start over from the beginning, keeping the Notepad open so I don't miss any steps.
I realize it's not ideal, but there is unfortunately not another way.

LINQ equivalent of SQL LEFT function?

We have a database with some varchar(max) fields which can contain a lot of text, but I have a situation where I only want to select, for example, the first 300 characters from the field for a paginated table of results on an MVC web site, as a "preview" of the field.
For a simplified example query where I want to get all locations to display in the table
(this would be paginated, so I don't just get everything - I get maybe 10 results at a time):
return db.locations;
However, this gives me location objects with all the fields containing the massive amounts of text, which makes the query very time-consuming to execute.
So what I resorted to before was using SQL stored procedures with:
LEFT(field, 300)
to resolve this issue, and then including the stored procedure in the LINQ to SQL .dbml file so it returns a "location" object for the result.
However, I have many queries and I don't want to have to do this for every query.
There may be a simple solution, but I am not sure how to phrase this for a search engine; I would appreciate anyone who can help me with this problem.
You can use functions that translate directly to those SQL functions. This is useful when you need to port code that works just fine in SQL over to LINQ with no risk.
Have a look at System.Data.Objects.EntityFunctions
Locations.Select(loc=>System.Data.Objects.EntityFunctions.Left(loc.Field,300))
This will get directly translated into a LEFT on the server side.
EDIT: I misread LEFT as LTRIM. Here is the list of String functions that can't be used in LINQ to SQL. Have you tried String.Substring()?
Your best option is to map the stored procedure and continue using it. Here is an excellent article with screen shots showing you how to do so.
If you're not using the designer tool you can also call ExecuteCommand against the DataContext. It isn't pretty, but it's what we have for now.
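If you do go the raw-SQL route, here is a minimal sketch of what that could look like. ExecuteQuery<T> is the query-returning counterpart of ExecuteCommand on a LINQ to SQL DataContext; the LocationPreview class and the table/column names are assumptions standing in for the real schema.

// Minimal sketch (table, column and class names are assumptions): run the
// LEFT() truncation as raw SQL through LINQ to SQL and map the rows onto a
// small preview class by column name.
using System.Collections.Generic;
using System.Data.Linq;
using System.Linq;

public class LocationPreview
{
    public string Description { get; set; }
    public string Text { get; set; }
}

public static class LocationPreviewQueries
{
    public static List<LocationPreview> GetPreviews(DataContext db)
    {
        // ExecuteQuery maps the result columns onto LocationPreview by name.
        return db.ExecuteQuery<LocationPreview>(
            "SELECT description AS Description, LEFT(text, 300) AS Text FROM locations"
        ).ToList();
    }
}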
I found something like this worked for me:
return from locationPart in db.locations
       select new LocationPart
       {
           Description = locationPart.description,
           Text = locationPart.text.Substring(0, 300)
       };
Not ideal because I have to use "select new" to return a different object, but it seems to work.
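To avoid repeating that projection in every query (the concern above about having many queries), one option is to write it once as an extension method on IQueryable and reuse it everywhere. The entity and preview classes below are simplified stand-ins for the classes generated from the .dbml file, so treat this as a sketch rather than drop-in code.

// Sketch only: the "first 300 characters" projection written once and reused.
// The location and LocationPart classes are simplified stand-ins for the
// classes generated from the .dbml file.
using System.Linq;

public class location            // stand-in for the generated entity class
{
    public string description { get; set; }
    public string text { get; set; }
}

public class LocationPart        // the preview class from the answer above
{
    public string Description { get; set; }
    public string Text { get; set; }
}

public static class LocationPartQueries
{
    public static IQueryable<LocationPart> ToPreview(this IQueryable<location> source)
    {
        // Substring is translated to SUBSTRING(...) on the server, so the full
        // varchar(max) value never leaves the database.
        return source.Select(l => new LocationPart
        {
            Description = l.description,
            Text = l.text.Substring(0, 300)
        });
    }
}

// Usage, keeping the paging from the question:
// var page = db.locations.OrderBy(l => l.description)
//                        .Skip(pageIndex * 10).Take(10)
//                        .ToPreview().ToList();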

How do I grep (search) a Crystal report for all uses of a column?

I am trying to remove all references to a table from a Crystal XI report. Crystal is telling me that a column from that table is currently being used, because there is a little green check mark over the field in the field viewer. Also, if I try to remove the entire table, I get a warning. The warning is almost useless though because it doesn't tell me where the field is used. Now, back when programmers were real programmers, and mice were things cats chased, I could just grep a directory or file and find all references to a variable I was interested in. But how do I do this in Crystal? I have already tried exporting the report to a Report Definition, which helped find some instances of the troublesome field. Unfortunately, that format does not include all formulas, just some. Please tell me I don't have to buy a third party app (or write my own COM thingy) just to do this seemingly simple thing.
EDIT to add details about tangential point:
In case anyone is wondering, I am not crazy - I have duplicated the issue where a formula's definition does not show up in the exported Report Definition. I created a new blank report and created one formula named stealth that returns 1234. I then used that formula in the Section Expert for the details section, in the "suppress" formula, setting it to {#stealth} == 0. The use of the formula shows up, but not the definition. So when my unwanted column was used in the formula, I was not able to find it! Here's what the rpt def looks like (after deleting some blank lines):
Crystal Report Professional v11.0 (32-bit) - Report Definition
1.0 File Information
Report File:
Version: 11.0
2.0 Record Sort Fields
3.0 Group Sort Fields
4.0 Formulas
4.1 Record Selection Formula
4.2 Group Selection Formula
4.3 Other Formulas
5.0 Sectional Information
5.1 Page Header Section
Visible, Keep Together
5.2 Page Footer Section
Visible, New Page After, Keep Together, Print At Bottom of Page
5.3 Report Header Section
Visible, New Page Before
5.4 Report Footer Section
Visible, New Page After
5.5 Details Section
Visible
Subsection.1
Visible, Keep Together
Format Formulas
Visible: {#stealth}= 0
If all else fails ...
File -> Export -> Export Report, then choose the Report Definition (TXT) option.
That will give you a plain-text representation of every element of the report. You can grep or CTRL-F or (insert search tool of your choice) through that. "Find in Formulas" usually works, but I've had to go the export route a couple of times, for no apparent reason.
Edit: Of course, if I'd bothered to completely read your post, I'd see that you've already done this.
Very curious.
If you right click on the field in Field Explorer and select Find in Formulas, it should bring up a dialog listing all of the places it is being used in formulas. On the left hand side of the dialog is a tree of all the possible places it could be, including oddball places like record selector and page formatting functions. Unfortunately, it does not seem to list running total fields.
EDIT: Oops, all the places it exists are listed at the bottom of the dialog; the tree view is the entire "DOM" of the report.
I know this is an old post, but...
Not knocking Find in Formulas, it's been saving me today, but I was having trouble finding the last instance of the field. Even after all of the formulas and the droppings on the report were taken care of, I still had one lone use hiding somewhere.
I found it hiding as a Subreport Link. Right click on the Subreport -> "Change Subreport Links..." and there was the culprit. Dropping in this post because I figured someone else might have this problem too.
Fields can also sometimes be hiding within the "Record Sort Expert".
Responding to an old post, but ran into a similar issue. I had a group based on the formula I wanted to delete that had a specified order. When I changed the grouping to a different field, the specified order remained. When I removed the specified order, my formula could be deleted.
This was tested on XIr2...
You can change the table's datasource through the "Set Datasource Location" dialog. When it goes into the column mapping mode, uncheck match-type and pick a new column that would cause an error in a formula (i.e., if the column you're looking for is a string, replace it with a datetime column). Go to the preview and you should get an error box like "A string is required here." Close that error and up pops the offending formula!
One more suggestion. After following a lot of the suggestions here, my report was still telling me the formula was in use. I had to close the report. When I opened it again, the check mark was gone and it let me delete it. This was on Crystal v11.0.0.1282.
In my case the Formula Field happened to be part of an old Running Total Field, which itself was not included in the report. Once I deleted this old Running Total Field I could delete the unused Formula Field.
Very late, but I use CR 2008 (12.3.0.601) and just today (6/16/2015) I am trying to document only the formulae of my report. I knew about exporting the Report Definition, and about finding a formula in all formulae. But there are about 50 formulae. I discovered that the exported Report Definition didn't document all of my formulae, but I didn't bother to uncover the logic behind that; instead, I plopped all formulae into a section, then exported the Report Definition. Voila. Of course, I still need to cull all the unnecessary definition elements. But at least I have all the formulae.
So, with all the great suggestions, I still had one instance hiding from me. I found out where it was by creating a clone of the data table and renaming/deleting the field.
I then used "Set Datasource Location" as suggested above to point to my new table. It did error out when it could not find that field, but it still didn't tell me where the field really was (it just said "report field").
I did NOT map it and clicked Continue, which deleted the field from the report. I then mapped it back to the real table and I was good.
In my case, there was a Chart, and the field was being used as one of the "on Change" fields.
Although this is an old post, this functional gap still exists within Crystal Reports itself. We have a fully functional 14-day trial of our third-party software that uses the latest Crystal .NET API to search for plain text within a library of Crystal RPT files in one fell swoop. It also searches the data saved within reports and the text within labels, as well as the datasource behind all your reports (stored procedures, views, and table data), with support for SQL Server, SQL Azure, MySQL, Oracle, Amazon RDS, DB2 and Access.
More info and trial downloads at http://www.finditez.com
Note: you will need to download and install the compatible SAP Crystal .NET runtime connector for searching your RPT file library.
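If buying a tool isn't an option, one do-it-yourself alternative (not mentioned above, and only a sketch) is to dump the formulas programmatically with the Crystal Reports .NET runtime and then grep the output. It has the same blind spot as the TXT export for conditional/section formulas, but it collects all of the formula-field texts in one place.

// Rough sketch, assuming the CrystalDecisions .NET runtime is installed:
// load an .rpt file, print every formula field name and its text, then grep
// the output for the column you are hunting for.
using System;
using CrystalDecisions.CrystalReports.Engine;

public static class FormulaDump
{
    public static void Main(string[] args)
    {
        using (var report = new ReportDocument())
        {
            report.Load(args[0]);   // path to the .rpt file

            // Report-level formula fields ({@name} formulas).
            foreach (FormulaFieldDefinition formula in report.DataDefinition.FormulaFields)
            {
                Console.WriteLine("{0}: {1}", formula.Name, formula.Text);
            }

            // The record selection formula is exposed separately.
            Console.WriteLine("RecordSelection: " + report.DataDefinition.RecordSelectionFormula);
        }
    }
}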
