Filetype error in older version of Power Query

I've built a pivot-table report in an Excel 365 file using a Power Query script that acquires data from an .xlsx file (previously exported from D365), via a file path that a VBA macro writes into the named cell ("filePath").
The Power Query code starts like this:
let
    // Read the source file path from the named cell "filePath" in the current workbook
    Ścieżka = Excel.CurrentWorkbook(){[Name="filePath"]}[Content]{0}[Column1],
    // Open the exported workbook from that path (this is the step that fails on Excel 2016)
    Źródło = Excel.Workbook(File.Contents(#"Ścieżka"), null, true),
    // Pick the table named AxTable1 from that workbook
    AxTable1_Table = Źródło{[Item="AxTable1",Kind="Table"]}[Data],
    ...
Next I do some filtering and column manipulation. I cannot share the file because it contains business data (standard production cost lines).
The problem is that, although the file works perfectly on PCs with Office 365, there is no way to run it properly in Excel 2016. At the second step of the let expression I get a popup window with the text:
[DataFormat.Error] The input couldn't be recognized as a valid Excel document.
Are there differences in how these two releases of Power Query load files, and are there guidelines for making the query work in both?
Step by step, I ruled out errors in the VBA macro. I checked the file's operation on four different computers (two with Office 365, two with Office 2016) using the same source data. The error is reproducible within the same software version.
I need to find a way to ensure data-retrieval compatibility across different versions of Power Query.

Related

You cannot import data to this record because the record was updated in Microsoft Dynamics 365 after it was exported

I'm having a strange issue with exporting/updating/importing data in our on-premises Dynamics 365 (8.2). I was doing a bulk update of over 3000 records by exporting the records to an Excel workbook, updating the data in a specific column, then importing the workbook back into CRM. It worked for all of the records except 14 of them, which, according to the import log, failed because "You cannot import data to this record because the record was updated in Microsoft Dynamics 365 after it was exported." I looked at the Audit History of those 14 records and found that they have not been modified in any way for a good two months. Strangely, the modified date of the most recent Audit History entry for ALL 14 records is the exact same date/time.
We have a custom workflow that runs once every 24 hours on a schedule that automatically updates the Age field of our Contact records based on the value in the respective Birthday field. For these 14 records, ALL of them have a birthday of November 3rd, but in different years. What that means though is that the last modification that was done to them was on 11/3/2019 via the workflow. However, I cannot understand why the system "thinks" that this should prevent a data update/import.
I am happy to provide any additional information that I may have forgotten to mention here. Can anyone help me, please?
While I was not able to discover why the records would not update, I was able to resolve the issue. Before I share what I did to update the records, I will try and list as many things as I can remember that I tried that did not work:
I reworked my Advanced Find query that I was using to export the records that needed updating so that it returned ONLY those records that had actual updates. Previously, I used a more forgiving query that returned about 30 or so records, even though I knew that only 14 of them had new data to import. I did so because the query was easier to construct, and it was no big deal to remove the "extra" records from the workbook before uploading it for import. I would write a VLOOKUP for the 30-something records and remove the rows for which the VLOOKUP didn't find a value in my dataset, leaving me with the 14 that did have new data. After getting the error a few times, I started to ensure that I only exported the 14 records that needed to be updated. However, I still got the error when trying to import.
I tried formatting the (Do Not Modify) Modified On column in the exported workbook to match the date format in the import window. On export of the records, Excel was formatting this column as m/d/yyyy h:mm while the import window with the details on each successful and failed import showed this column in mm/dd/yyyy hh:mm:ss format. I thought maybe if I matched the format in Excel to the import window format it might allow the records to import. It did not.
I tried using some Checksum verification tool to ensure that the value in the (Do Not Modify) Checksum column in the workbook wasn't being written incorrectly or in an invalid format. While the tool I used didn't actually give me much useful information, it did recognize that the values were checksum hashes, so I supposed that was helpful enough for my purposes.
I tried switching my browser from the new Edge browser (the one that uses Chromium) to just IE as suggested on the thread provided by Arun. However, it did not resolve the issue.
What ended up working in the end was Arun's suggestion to just make some arbitrary edit to all the records and export them afterward. This was okay to do for just 14 records, but I'm still slightly vexed, as this wouldn't really be a feasible solution if it were, say, a thousand records that were not importing. There was no field that ALL 14 Contact records had in common that I could just bulk edit and then bulk edit back again. What I ended up doing was finding a text field on the Contact Form that did not have any value in it for any of the records, putting something in that field, then going to each record in turn and removing the value (since I don't know of a way to "blank out" or clear a text field while bulk editing). Again, this was okay for such a small number of records, but if it were to happen on a larger number, I would have to come up with an easier way to bulk edit and then bulk "restore" the records. Thanks to Arun for the helpful insights, and for taking the time to answer. It is highly appreciated!
When you first do an import of an entity (Contacts, for example), you will see that the Excel file you import contains three hidden columns: (Do Not Modify) Contact, (Do Not Modify) Row Checksum, and (Do Not Modify) Modified On.
When you want to create new instances of the entity, just edit the records and clear the content of the three hidden columns.
This error happens when the checksum or the rowversion of the exported record differs from the record in the database.
Try making a dummy edit to the affected records, then export and re-import them again.
I could think of two reasons: either the datetime format is confusing the system :( or the community thread explains a weird scenario.
Apparently, when importing the file, amending it and then saving it as a different file type alters the spreadsheet's parameters.
I therefore used Internet Explorer, since when importing the file the system asks the user to save it in a different format. I added .xlsx at the end to save it as the required format. I amended the file and imported it back into CRM. It worked.
For me it turned out to be a different CRM time zone setting between the exporting and importing users. Unfortunately, this setting does not seem to be changeable by an administrator via the user interface.
The setting is available for each user under File->Options->Time Zone.

Need multiple copies of the same report in Crystal Report

I am stuck with a peculiar problem. Here is my situation. I have an invoice designed in Crystal Reports. I want to provide a feature whereby the user can print multiple copies. The statutory requirement is that every copy has a different title (e.g., the first copy may be titled "Original", the second "Duplicate for Transporter", etc.). The invoice already has two sub-reports (the first for the items and the second for tax details). Now I am stuck on how to get multiple copies of the same report with different titles. I tried using a sub-report, but according to Crystal Reports, one cannot use sub-reports within sub-reports.
I would be grateful for any ideas. I am a little new to Crystal Reports. Thanks to all in advance.
My environment is VS 2010, Crystal Report v13, SQL Server 2008, .NET 4.0.
If all the data on your reports is the same and only the title changes, you can set a TextObject for your title from VB.
Here's how
' Grab the title TextObject from the report definition and set its text at runtime
Dim T As CrystalDecisions.CrystalReports.Engine.TextObject
T = CType(cryRpt.ReportDefinition.Sections(1).ReportObjects("yourTextObjectFromXtalReport"), CrystalDecisions.CrystalReports.Engine.TextObject)
T.Text = "Your Title"
All you have to do is set conditions in your program if you want to produce multiple copies with different titles.
You can also print crystal reports programmatically by following this.
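For example, here is a minimal sketch of that idea, assuming the report is loaded into a ReportDocument named cryRpt and the title text object is the one named "yourTextObjectFromXtalReport" above: loop over the required titles, update the text object, and print one copy per title.
' Hypothetical sketch: print one copy of the same report per statutory title.
Dim titles() As String = {"Original", "Duplicate for Transporter"}
Dim t As CrystalDecisions.CrystalReports.Engine.TextObject = CType(cryRpt.ReportDefinition.Sections(1).ReportObjects("yourTextObjectFromXtalReport"), CrystalDecisions.CrystalReports.Engine.TextObject)
For Each copyTitle As String In titles
    t.Text = copyTitle
    cryRpt.PrintToPrinter(1, False, 0, 0) ' one copy, uncollated, all pages
Next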
You can create a parameter named Title and set its value each time you print the report. Drag the parameter onto the report in order to have it printed.
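A minimal sketch of that approach, assuming the report is loaded into a ReportDocument named cryRpt and the parameter is called Title (both names are assumptions, not from the original project):
' Hypothetical sketch: set the Title parameter before each print run.
cryRpt.SetParameterValue("Title", "Original")
cryRpt.PrintToPrinter(1, False, 0, 0)
cryRpt.SetParameterValue("Title", "Duplicate for Transporter")
cryRpt.PrintToPrinter(1, False, 0, 0)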

Duplicate dataset etc. error in rdlc report

I keep getting an unhelpful error while trying to create a subreport using Visual Studio 2005. The error is:
More than one data set, data region, or grouping in the report has the name ‘Factor_StoreTrak_StoreTrakEntities_POS_PollingResultsDetailDTO’. Data set, data region, and grouping names must be unique within a report. C:\Development\WindowsApps\ReportTesting\Reports\PollingResultsSubreport.rdlc
Now, I keep having the report partially done, compiling and running just fine, but when I add a new column or change something I suddenly get this error. I then delete every control off my report one by one, trying to recompile after each deletion, and this error is always thrown. I delete the report and start from scratch, only to have it happen again at some random point.
If you need to change the report after the referenced assembly is recompiled / versioned, just open the rdlc file (the report file) with Notepad.exe (it is just an XML file), find the DataSets section, delete the older data set, save the changes, and reopen the report in the IDE.
If the new dataset is not yet in the file, simply drag a field from the datasources panel onto your report somewhere, this will create a new dataset in the report.
In my case the problem was that when I inserted a row group, it was given the name "matrix1_RowGroup4", even though there already was a "matrix1_RowGroup4" in the report. It then complained, "More than one data set, data region, or grouping in the report has the name 'matrix1_RowGroup4'."
What I did to solve it was:
Opened the XML file ([myreportname].rdl) - I didn't do it in Notepad, but in the Visual Studio IDE.
Searched for "matrix1_RowGroup4"; as indicated by the err msg, there were two
Incremented the name of the second/newer one to "matrix1_RowGroup5"
Naturally, SSRS being what it is, the Preview still wouldn't display at first for some bogus reason (dataset couldn't be found or some such); I went to the Data tab, and refreshed the fields, and then it was okay.

Using parameters in reports for Visual Studio 2008

This is my first attempt to create a Visual Studio 2008 report using parameters. I have created the dataset and the report. If I run it with a hard-coded filter on a column the report runs fine. When I change the filter to '?' I keep getting this error:
No overload for method 'Fill' takes '1' argument
Obviously I am missing some way to connect the parameter on the dataset to a report parameter. I have defined a report parameter using the Report/Report Parameter screen. But how does that report parameter get tied to the dataset table parameter? Is there a special naming convention for the parameter?
I have Googled this a half dozen times and read the MSDN documentation, but the examples all seem to use a different approach (like creating a SQL query rather than a table-based dataset) or entering the parameter name as "=Parameters!name.value", and I can't figure out where to do that. One MSDN example suggested I needed to write some C# code using a SetParameters() method to make the connection. Is that how it is done?
If anyone can recommend a good walk-through I'd appreciate it.
Edit:
After more reading it appears I don't need report parameters at all. I am simply trying to add a parameter to the database query. So I would create a text box on the form, get the user's input, then apply that parameter programmatically to the fill() argument list. The report parameter on the other hand is an ad-hoc value generally entered by a user that you want to appear on the report. But there is no relationship between report parameters and query/dataset parameters. Is that correct?
My last assumption appears to be correct. After 30 years in the industry my bias is to assume a report parameter actually filters the SQL data using the given parameter. This is not the case with .rdlc files used by Report Viewer. These report parameters have nothing to do with fetching data. Sounds like this was a design decision on Microsoft's part to completely separate the display of data from the fetching of data, hence, Report Viewer has no knowledge of how data may be fetched. Best way for me to conceptualize this dichotomy is to think of Report Parameters more as Report Labels, quite distinct from the dataset query parameters.
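To illustrate the distinction, here is a minimal sketch, assuming a Windows Forms ReportViewer in local mode, a typed dataset with a parameterized TableAdapter query, and a text box for the user's input (the names MyDataSet, CustomersTableAdapter, FillByCity, txtCity, and CityLabel are all illustrative assumptions):
' Hypothetical: the user-entered value feeds the dataset query, not a report parameter.
Me.CustomersTableAdapter.FillByCity(Me.MyDataSet.Customers, txtCity.Text)
' A report parameter, by contrast, only carries a display value into the .rdlc.
Dim p As New Microsoft.Reporting.WinForms.ReportParameter("CityLabel", txtCity.Text)
Me.ReportViewer1.LocalReport.SetParameters(New Microsoft.Reporting.WinForms.ReportParameter() {p})
Me.ReportViewer1.RefreshReport()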

How do I grep (search) a Crystal report for all uses of a column?

I am trying to remove all references to a table from a Crystal XI report. Crystal is telling me that a column from that table is currently being used, because there is a little green check mark over the field in the field viewer. Also, if I try to remove the entire table, I get a warning. The warning is almost useless though because it doesn't tell me where the field is used. Now, back when programmers were real programmers, and mice were things cats chased, I could just grep a directory or file and find all references to a variable I was interested in. But how do I do this in Crystal? I have already tried exporting the report to a Report Definition, which helped find some instances of the troublesome field. Unfortunately, that format does not include all formulas, just some. Please tell me I don't have to buy a third party app (or write my own COM thingy) just to do this seemingly simple thing.
EDIT to add details about tangential point:
In case anyone is wondering, I am not crazy - I have duplicated the issue where a formula's definition does not show up in the exported Report Definition. I created a new blank report and created one formula named stealth that returns 1234. I then used that formula in the Section Expert for the details section, in the "suppress" formula, setting it to {#stealth} == 0. The use of the formula shows up, but not the definition. So when my unwanted column was used in the formula, I was not able to find it! Here's what the report definition looks like (after deleting some blank lines):
Crystal Report Professional v11.0 (32-bit) - Report Definition
1.0 File Information
Report File:
Version: 11.0
2.0 Record Sort Fields
3.0 Group Sort Fields
4.0 Formulas
4.1 Record Selection Formula
4.2 Group Selection Formula
4.3 Other Formulas
5.0 Sectional Information
5.1 Page Header Section
Visible, Keep Together
5.2 Page Footer Section
Visible, New Page After, Keep Together, Print At Bottom of Page
5.3 Report Header Section
Visible, New Page Before
5.4 Report Footer Section
Visible, New Page After
5.5 Details Section
Visible
Subsection.1
Visible, Keep Together
Format Formulas
Visible: {#stealth}= 0
If all else fails ...
File -> Export -> Export Report, then choose the Report Definition (TXT) option.
That will give you a plain-text representation of every element of the report. You can grep or CTRL-F or (insert search tool of your choice) through that. "Find in Formulas" usually works, but I've had to go the export route a couple of times, for no apparent reason.
Edit: Of course, if I'd bothered to completely read your post, I'd see that you've already done this.
Very curious.
If you right click on the field in Field Explorer and select Find in Formulas, it should bring up a dialog listing all of the places it is being used in formulas. On the left hand side of the dialog is a tree of all the possible places it could be, including oddball places like record selector and page formatting functions. Unfortunately, it does not seem to list running total fields.
EDIT: Oops, all the places it exists are listed at the bottom of the dialog; the tree view is the entire "DOM" of the report.
I know this is an old post, but...
Not knocking the Find in Formulas, it's been saving me today, but I was having trouble finding the last instance of the field. Even after all of the formulas and the droppings on the report were taken care of, I still had one lone use hiding somewhere.
I found it hiding as a Subreport Link. Right click on the Subreport -> "Change Subreport Links..." and there was the culprit. Dropping in this post because I figured someone else might have this problem too.
Fields can also sometimes be hiding within "Record Sort Expert"
Responding to an old post, but ran into a similar issue. I had a group based on the formula I wanted to delete that had a specified order. When I changed the grouping to a different field, the specified order remained. When I removed the specified order, my formula could be deleted.
This was tested on XIr2...
You can change the table's datasource through the "Set Datasource Location" dialog. Then, when it goes into column-mapping mode, uncheck match-type and pick a new column that will cause an error in a formula (i.e., if the column you're looking for is a string, replace it with a datetime column). Go to the preview and you should get an error box like "A string is required here."; close that error and up pops the offending formula!
One more suggestion. After following a lot of the suggestions here, my report was still telling me the formula was in use. I had to close the report. When I opened it again, the check mark was gone and it let me delete it. This was on Crystal v 11.0.0.1282
In my case the Formula Field happened to be part of an old Running Total Field, which itself was not included in the report. Once I deleted this old Running Total Field I could delete the unused Formula Field.
Very late, but I use CR 2008 (12.3.0.601) and just today (6/16/2015) I am trying to document only the formulae of my report. I knew about exporting the Report Definition and about Find in Formulas. But there are about 50 formulae. I discovered that the exported Report Definition didn't document all of my formulae, but I didn't bother to uncover the logic behind that; instead, I plopped all formulae into a section, then exported the Report Definition. Voila. Of course, I still need to cull all the unnecessary definition elements. But at least I have all the formulae.
So, even with all the great suggestions, I still had one instance hiding from me. I found out where it was by creating a clone of the data table and renaming/deleting the field.
I then used the "Set Database Location" as suggested above to point to my new table. It did error out when it could not find that field but still didn't tell me where it really was (it just said report field).
I did NOT map it and clicked continue which deleted the field from the report. I then mapped it back to the real table and I was good.
In my case, there was a Chart, and the field was being used as one of the "on Change" fields.
Although an old post, this functional gap still exists within Crystal Reports itself. We have a fully functional 14 day trial of our third party software that uses the latest Crystal.net API to search for plain text within a library of Crystal RPT files in one fell swoop. Also searches the data saved within reports, and text within labels ... as well as datasource behind all your reports ( stored procedures, views, and table data ) with support for SQL Server, SQL Azure, MySQL, Oracle, Amazon RDS, DB2 and Access.
More info and trial downloads at http://www.finditez.com
Note, you will need to download and install the compatible SAP Crystal.net runtime connector for searching your RPT file library.
