MS Access 2010 subreports run fine, but main report freezes, only for user, not for dev. Why, and how to fix?

I am using MS Access 2010 on Windows 7, in a moderately locked down corporate environment. I have developed a report that calls several subreports, and one subform, which in turn has several embedded graphs. The various subentities pull data from multiple queries that build on other queries, some of which are parameterized. The two parameters (Year and Month values) are taken from a single form that includes buttons to preview and print the report. All data tables are local to the .accdb file; there is no server back end.
I developed the file locally, then copied it to a shared network folder and tested. Everything works as expected for me -- the report takes a few seconds to run, as the data work is admittedly a bit clunky, but it still displays in a timely fashion. I am in California, the shared folder is somewhere in the Midwest or South, and my coworker is in Texas. I asked my coworker to open the file, enter values in the form, and run the report. He got a warning saying the file was read-only; the form still ran, but the report locked up. Specifically, the report starts to run, the progress bar moves to about a third or two-fifths of the way across, and then it just stops. There is no error message.
I had my coworker force-close Access, deleted the lock file, gave him full permissions on the file, and had him try again. Same results, minus the read-only warning. Entering values in the form and running the report directly, rather than using the buttons, gives the same result. I had him go through the other database objects, and all of the tables, base queries, subreports, and the subform open "instantly", with the expected data and record counts. For some reason, though, bringing it all together locks up Access every time.
My coworker was doing a screen share for the initial tests, so I know he was following directions correctly. We also tried it without the screen share, in case bandwidth was the issue, with no improvement.
I've tried searching variations on "ms access report locks/freezes/hangs" via Google and SO, and got a bunch of stuff about record-locking, which I think is not the issue here. If anyone can suggest better search terms, I'm game to try them.
The data used is confidential, so I'm reluctant to post code, but can work on sanitized versions if that's necessary for a solution.

Since the file is opened read-only, any calculations that write data to a table are not going to work. I suggest you double-check the security permissions the user has on the network folder, both the NTFS permissions and the shared-folder permissions.
He needs write permission on the entire network folder (not just the database file), because Access will try to create the .laccdb lock file in that folder and write to it while the database is open. If he does not have write permission on the folder, Access will not be able to create or update the .laccdb file.
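If you want a quick way to confirm this from the coworker's machine, here is a minimal diagnostic sketch (assuming Python is available; the UNC path is a placeholder) that tries to create and delete a scratch file in the shared folder, which is essentially what Access has to do for the .laccdb lock file:
import os, tempfile

# Placeholder UNC path -- substitute the real shared folder
folder = r"\\server\share\AccessReports"

try:
    # Creating and removing a scratch file mirrors what Access must do
    # when it creates the .laccdb lock file in this folder.
    fd, probe = tempfile.mkstemp(dir=folder)
    os.close(fd)
    os.remove(probe)
    print("Write access OK:", folder)
except OSError as err:
    print("Cannot create files in this folder:", err)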

Related

TFS Check-in oddity causes file to become a different file entirely

I'm trying to figure out the root cause of a strange TFS error we are seeing in our current instance. It wasn't noticed until after a server move, but I'm not sure if they're directly related, because the error seems to be showing up for check-ins about a week prior to the move, as well as all those following it.
We first noticed the problem when I tried to get latest, and got several errors indicating:
"The downloaded file is corrupt. Please get the file again."
Upon looking into the error, we noticed that, starting with a single check-in, every code update has resulted in files being replaced with the contents of other files (ranging from project files to binary executables, presumably assembly DLLs), rather than the expected content, which is still present on our local development machines.
I don't have admin access to the servers myself, but am looking for ideas on possible causes and/or fixes for our team to investigate.
After weeks of searching, I finally found another mention of this sort of thing happening, along with a solution that appears to have worked.
Clear the Application Tier cache.
MSDN Archived Forums: TFS swapping contents of files

Can a script be too new for a google sheet?

I work on a Google Sheet that several departments look at and/or add data to all day long, every day. I have been working on scripts to make my department's life a lot easier. I created an exact duplicate of the sheet so I could make sure everything works before executing new scripts.
I have one that sets up an order, sends an email and puts it on the calendar all in one click. It works great.
In the email we need to send a link to a job folder. So we have a script to find that folder and get the link to it.
// getFoldersByName() returns a FolderIterator, not a single folder
var folders = DriveApp.getFoldersByName("12345 - Help me");
// "in" is a reserved word in JavaScript, so use a different variable name
var folder = folders.next();   // assumes at least one matching folder exists
var link = folder.getUrl();    // shareable URL to the job folder
In my testing grounds this works exactly how it should. When I put it into the actual sheet that we work in I get an error
"Error Exception: We're sorry, a server error occurred. Please wait a bit and try again."
I have been trying to figure it out for 4 days so far and am getting nowhere.
I had the "owner" of the sheet transfer it into my ownership in case that was the problem.
I moved it to a shared drive.
Made a copy of the whole spreadsheet to test it; it worked in the copy just fine.
Changing over to a new spreadsheet would be a lot of work that would have to take place after work hours, when no one should be using it. I am hoping there is a way to refresh the spreadsheet in such a way that we need to reapprove scripts (or something). The spreadsheet in question was created in 2018. I am wondering if it's just too old for the script; not that that makes any sense, but I can't think of anything else.
Thoughts?
From the question
The spreadsheet in question was created in 2018. I am wondering if it's just too old for the script; not that that makes any sense, but I can't think of anything else.
Nowadays Google Apps Script supports two runtimes, the old one (Rhino) and the new one (V8). There are posts reporting that switching from one runtime to the other fixed an issue. With that in mind, the first thing to check is which runtime each script project is using, both the copy used as a "testing ground" and the one in production, since running them on different runtimes is a common source of confusion. The runtime can be seen and changed in the script project's settings, or via the runtimeVersion field in the appsscript.json manifest.
Another thing to try is to create a standard Google Cloud Platform (GCP) project to replace the script's default GCP project, and enable the Google Drive API on that project.
Resources
https://developers.google.com/apps-script/guides/v8-runtime
https://developers.google.com/apps-script/guides/support/troubleshooting

Tool for Multiple Code Deployments.

Sorry if a similar question has been posed before. There are a lot of deployment questions but none seemed to address my problem.
Anyway, I'm working with ASP.NET and C#, using Visual Studio.
The Organization I'm working in is changing rapidly. There are a lot of projects coming in the pipeline that will require multiple code changes and iterative deployments over the next few months. While working, these changes are always 'on the forefront', so sometimes I have to code certain parts of the same program multiple times.
Since these projects are all staggered, I can't just make one sweeping change all at once; I have to deploy and redeploy the same program multiple times, using only the changes that are required for that deployment.
If this is confusing, here's a simple example:
Application is being used on an Intranet. This application calls our Database, using Driver A.
There are two environments, test and production.
Certain Stored procedures have to be called with parameters that register 'Test' to allow certain other applications to run even with bad data (for testing purposes).
When deploying the applications, these stored procedures have to be modified to remove the Test parameters.
We also have an operating system upgrade that allows us to move to a much faster Driver B, but it requires changes to the code to use Driver B.
So that's two wholly different deployments where some code must be changed for Deployment 1 and other code must be changed for Deployment 2.
Currently I'm just using Notepad for an overall change list, regular debugging breakpoints, and a multitude of in-code comments, and then I manually slog through the code to make sure that everything is changed. With hundreds of thousands of lines of code across multiple files, classes, objects, etc., this gets pretty tedious, and there is a good chance of missing something (causing the code to break) or pushing the wrong changes (causing it to either break or allow bad data).
Is there a tool that could help in this situation? Preferably one that lets me track what needs to change for Deployment A and what needs to change for Deployment B. I'm also open to hearing other schools of thought (tips are definitely accepted!).
Sure, I understand your problem.
I would suggest a couple of things:
Installers: Why not consider installers? There are plenty of them, e.g. InstallShield, WiX, and MSI installers.
These installers give you the flexibility to update only the files you need to update, i.e. full control.
But you need to choose the one that suits you best. I have worked with MSI and WiX a lot, so I know this can sort out your problem; however, it's your call.
Publish: I haven't played around with this much beyond publishing websites, but I know it can do wonders, so try it as well.
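One more school of thought, since the question invites them: drive the per-deployment differences from configuration rather than editing code before each release. Below is a minimal sketch of the idea (the config keys, driver names, and stored-procedure call are hypothetical, and Python is used purely for illustration); a single settings file per environment selects the driver and the test flag, so the same codebase can be deployed everywhere unchanged.
import json

# One small file per environment, e.g. deploy.test.json / deploy.prod.json
with open("deploy.json") as f:
    cfg = json.load(f)

db_driver = cfg["db_driver"]   # e.g. "DriverA" today, "DriverB" after the OS upgrade
test_mode = cfg["test_mode"]   # True in the test environment, False in production

def process_order(cursor, order_id):
    # The Test flag travels as a parameter, so the stored-procedure call
    # never has to be hand-edited between deployments.
    cursor.execute("EXEC usp_ProcessOrder ?, ?", order_id, test_mode)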

Crystal Reports 2008 doesn't show database tables / views that I know exist

I have a schema in an Oracle 11g R2 database that I'm trying to connect Crystal Reports to.
I have two users; an admin user (where I create the views, etc.) and a reporting user that has the ability to query certain tables/views.
In any other database tool (SQL Developer, TOAD, DB Visualizer), I can see the schema along with its tables and views, and I can query against them, create new views, etc., as I should be able to.
However, in Crystal Reports 2008, when attempting to access the data, the proper schemas/views aren't displayed. Examples:
When I create an ODBC datasource in Crystal (which I believe connects to one I've pre-created in Windows that works just fine), only a small subset of schemas is shown in Crystal (not including the one I should be able to see).
When I create an Oracle datasource in Crystal, it shows me the schema and, I believe, all of the tables, but only one of the views (not the one I need).
NOTE: Normally I would think that it's a permissions issue on the database, except that I can access these schemas/tables/views properly from every other client I've tried.
Any ideas? Is it the drivers that Crystal 2008 uses? Is it still somehow possibly a permissions issue? I'd appreciate any insight you fine folks have.
Looks like this was indeed an error on our DBA's part. A certain level of "select" permissions in their permission model was preventing access. It appears to have been resolved.
But if anyone would like to help me gather all copies of Crystal 2008 in a warehouse and light them on fire, be my guest. :)
I've got a better one...
I was working with this for a long time today, trying to help one of our new developers. He had developed a report from a different workstation against a different data source, and we needed to swap the data source when we transferred it to the new network. Fired up CR, Showed him how to "Set Datasource Location", we get the account information, check the connection string, etc. Get ready to show him how to replace one db w/ another... find the connection, open the server, pop out the databases, open the database to show the tables and... Nothing. Hm...
Try a different account that I know works. Strange, THAT one doesn't see any tables either. Try a different database. OK, now I'm a little off-balance... Remote into the web server to see if I can run one from there. Fire up CR, Open an existing report, hit refresh, put the PW in, and voila! Data. Lots. Copy his report up, remote in, open it, get ready to Set Datasource Location, and ... nothing.
Spoke w/ the DBA, watched/walked him through the check, still nothing.
Funny thing was, if I had a report that had connected before, it would run. Wonderful! Check the available tables... nothing. Quick jump to look at the db... I can see the privileges, I can see everything set fine. Cool. Tried again, nothing.
OK, spoke to another DBA. I walk him through CR to show him the issue; he and I are going to explicitly set permissions. I open the data source in CR, right click to look at Properties, and... notice that I hadn't checked Options. Sinking feeling in the pit of my stomach. Open Options, and notice in the Data Explorer section, TABLES is not checked.
I remember WHY I set it... a long time ago. The DB has thousands of tables, and I knew which ones I needed. I paste a command and go, I never CHOOSE tables.
So... Check TABLES, and thousands of tables show up again. Sigh.
Open Crystal Reports, then click File -> Options -> Database tab. In the Data Explorer options, put a tick mark on Tables and set Owner Like <add schema name>, then click OK.
This will list only that schema. Crystal Reports has a limit when loading all table names, so select the schema so that it loads only that one.
Thanks,
Praveen.

Why do the records get deleted in an access database when I open the file in access?

I'm developing a simple database app in Visual Studio (C#, for Windows) using an Access backend.
That's all fine until I try to open the database file from within Access, at which point all the records get deleted.
Could anyone explain why this happens, please?
Did you add the database to your solution? Select it and check the Copy Local (aka Copy to Output Directory) setting in the Properties window. Make sure it isn't set to Copy Always; otherwise every build copies the original database from your project folder over the copy in the output folder, wiping out whatever rows the app has added.
We need more information. You say that when you open the database from within Access, "all the records get deleted". The way the question is phrased implies that some process is running as part of opening the database, e.g. an AutoExec macro. Do you really know that is what is happening? Or are you really just saying "when I open the database file from Access, the records are not there"? If the latter, then something is happening along the lines of what cletus suggests.
This is an old question and I don't know if the original poster is still around, but something that didn't occur to me at the time I originally read the question was that perhaps the C# app is using a transaction to insert the data and is not committing it. If that were the case, the data would be visible in the C# app and would not be there when you opened the file in Access. On the other hand, the data wouldn't be there in a new session of the C# app, either, so this might not be the issue.
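For what it's worth, the pitfall is easy to reproduce outside the original C# app. The sketch below uses Python and pyodbc against a throwaway .accdb (the path, table, and column names are hypothetical): with autocommit left at its default of off, the inserted row silently disappears when the connection closes unless commit() is called.
import pyodbc

conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\example.accdb;"       # hypothetical database path
)                                        # autocommit defaults to False
cur = conn.cursor()
cur.execute("INSERT INTO Customers (CustName) VALUES (?)", "Test row")
# conn.commit()   # without this line, the insert is rolled back...
conn.close()       # ...so opening example.accdb in Access shows no new row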
