How to make a copy with all the structure/schema from an existing table in FoxPro?
copy to won't work.
There is no single command in VFP that copies everything an existing table might have. You need to define your requirements and know exactly what your existing table contains.
For example, for a simple copy with data you can use:
copy to NewTableName with cdx [Database dbName [Name tableNameInNewDb]]
This would copy the structure, along with the indexes and existing data. However, it wouldn't copy other database-level properties, if there are any; you need to use cursorgetprop()/cursorsetprop() and dbgetprop()/dbsetprop() for those.
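For instance, a minimal sketch of carrying one such property across, using the table Comment as an example (the table and database names here are illustrative):
OPEN DATABASE dbName
cComment = DBGETPROP("sourceTable", "Table", "Comment")
=DBSETPROP("NewTableName", "Table", "Comment", cComment)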
One way to do a complete copy is to first get a programmatic version of your database and table creation using GenDbc (home()+'tools\gendbc\gendbc.prg'), and then edit the generated code for your new table (and also append the data from the old one).
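As a sketch, assuming the database is already open (GenDbc writes the generated program to the file you name; the names here are illustrative):
OPEN DATABASE myDatabase
DO (HOME()+'tools\gendbc\gendbc.prg') WITH 'MyDatabaseCode.prg'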
I wish you had initially explained what you meant by saying "copy to won't work".
PS: I noticed your question title says "make a blank copy". In that case you might want to use:
copy to NewTableName with cdx for .F.
It would copy the structure and indexes but not the data. The other shortcomings still apply. If you don't need the indexes, a simpler way would be:
select * from sourceTable where .F. into table targetTable
EDIT: As LAK pointed out, using WHILE instead of FOR is faster on large tables, because WHILE .F. stops at the first record whereas FOR .F. still evaluates the condition against every record:
copy to NewTableName with cdx while .F.
It's
copy structure to newfile
or, alternatively,
copy to newfile for .f.
I have a filegroup named Year2020 which contains three different .ndf files, for example Summer.ndf, Winter.ndf, and Fall.ndf.
Now I want to create a Fall table, and I want that table to be saved in the Fall.ndf file, not in Summer.ndf or Winter.ndf. Is there a way to do this? I am using SQL Server.
The problem is that all the files are in the same filegroup, named Year2020. How can we save the table exactly where we want?
When I save the Fall table it goes into Summer.ndf, not into Fall.ndf.
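For context: SQL Server places tables on filegroups, not on individual files, and data is spread across a filegroup's files by proportional fill, so a table cannot target one file inside a multi-file filegroup. The usual workaround is to give each file its own filegroup. A minimal sketch, with illustrative database, path, and table names:
ALTER DATABASE MyDb ADD FILEGROUP FallFG;
ALTER DATABASE MyDb ADD FILE (NAME = FallData, FILENAME = 'C:\Data\Fall.ndf') TO FILEGROUP FallFG;
CREATE TABLE dbo.Fall (Id INT IDENTITY PRIMARY KEY, Note NVARCHAR(100)) ON FallFG;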
I have a query that reads data from Table 1. However, when some people use my sheet, they delete the whole table before they paste their extracted data, so Power Query breaks. How can I avoid that? Multiple people use this sheet, and most of them delete everything in the source tab before they paste their data.
I have a metadata file named CONTACTS(SOURCE.CSV|TARGET.CSV). I read this file with a reader and populate the values into a table I created as CONTACT_TABLE(PK NUMBER, SOURCE_NAME VARCHAR2(500), TARGET_NAME VARCHAR2(500)). After that, I want to read the source.csv and target.csv files stored in my CONTACT_TABLE and populate the values into another table called SOURCE_COLUMN_TARGET_COLUMN_TABLE(PK, FK referencing the PK of CONTACT_TABLE, SOURCE_COLUMN, TARGET_COLUMN). This table should contain all the columns of source and target, with a one-to-one mapping between them; for example, source.csv(fn) maps to target.csv(firstName).
My objective is that whenever we add another attribute to source or target, I should not have to change the entire mapping; for example, if we add source.csv(email) and target.csv(email), it should map directly.
Thanks! Please help!
I have to have this task completed before Friday. I searched every source I could find on dynamic mapping and parameters, but it was not very helpful; I want to do it this way.
It's not clear what you are asking, actually. The Source Analyzer uses the source files (.csv) at import itself, and the Source Qualifier thereby keeps the same format.
So, if any values get added to your existing files (source.csv, target.csv), that is a new file for your existing mapping. Hence, you don't need to change the whole mapping; you just need to import it again.
I've made the mistake of using the 'Calculate and Replace Column' feature to replace the wrong column, and realized after the fact. The column I replaced corresponds to last names and is important. I would like to retrieve this column but maintain my other 15 or so data transformations. Ideally, I would like to remove this transformation, but I've come up empty so far. Here's what I've tried:
I tried adding the 'last name' column again from the same external source, using >Insert >Columns... I also tried renaming this column to avoid the data transformation. Unfortunately, this resulted in an entirely empty column, so either it did not successfully match to the table or it was affected by the transformation.
I checked the source information, and found exactly the 3-4 lines that I wish were not there. I thought it might be possible to edit this but haven't found a way. This seems like it would be the easiest.
Another idea I had was that I could replace the data table with the same source and repeat all of the transformations from the Replace Data Table dialogue (excluding the bad one). This is my next plan of attack, but I figured I would come on here first to see if there's an easier way.
Thanks in advance!
Good news for you, #jeremyVollen!
It is possible to edit your transformations, per TIBCO article 44098.
Resolution: If there is more than one transformation on a data table and you need to edit any of those transformations, follow the steps below:
Go to Edit >> Data Table Properties.
Select the data table to which the transformation has been added and click Refresh Data > With Prompt.
A new window will pop up that allows you to make the desired changes to each of the transformations.
Unfortunately, it is NOT possible to reverse data table transformations.
It IS possible to undo the transformations with Edit >> Undo or CTRL+Z, but that's as far as it goes.
My strategy for dealing with this is (in accordance with your #3) to visit Edit >> Data Table Properties, select the table I'm interested in, select Source Information, then copy the contents of the text area and paste it into Notepad. Then I'll File >> Replace Data Table and start over from the beginning, keeping the notepad open so I don't miss any steps.
I realize it's not ideal, but unfortunately there is no other way.
I'm trying to write an UPDATE SQL statement in PostgreSQL (PG Commander) that will update a user profile image column.
I've tried this:
update mytable set avatarImg = pg_read_file('/Users/myUser/profile.png')::bytea where userid=5;
and got: ERROR: absolute path not allowed
Read the file in the client.
Escape the contents as bytea.
Insert into database as normal.
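A minimal sketch of those three steps, assuming psql as the client (the backquoted base64 command runs on the client machine, and decode() ignores the newlines base64 inserts):
\set img `base64 < /Users/myUser/profile.png`
update mytable set avatarImg = decode(:'img', 'base64') where userid = 5;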
(Elaborating on Richard's correct but terse answer; his should be marked as correct):
pg_read_file is really only intended as an administrative tool, and per the manual:
The functions shown in Table 9-72 provide native access to files on the machine hosting the server. Only files within the database cluster directory and the log_directory can be accessed.
Even if that restriction didn't apply, using pg_read_file would be incorrect; you'd have to use pg_read_binary_file. You can't just read text and cast to bytea like that.
The path restrictions mean that you must read the file using the client application, as Richard says: read the file from the client, set it as a bytea placeholder parameter in your SQL, and send the query.
Alternatively, you could use lo_import to read the server-side file in as a binary large object, then read that as bytea and delete the large object.
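A sketch of that large-object route (lo_get() requires PostgreSQL 9.4+; the OID is whatever lo_import returns, 16393 here is illustrative, and the path must be readable by the server process):
select lo_import('/Users/myUser/profile.png'); -- superuser only; returns an OID
update mytable set avatarImg = lo_get(16393) where userid = 5;
select lo_unlink(16393); -- remove the temporary large object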
pg_read_file can read files only from the data directory path. If you would like to know your data directory path, use:
SHOW data_directory;
For example, it will show:
/var/lib/postgresql/data
Copy your file to the directory shown.
After that, you can use just the file name in your query.
UPDATE student_card SET student_image = pg_read_file('up.jpg')::bytea;
or you can use the pg_read_binary_file function:
UPDATE student_card SET student_image = pg_read_binary_file('up.jpg')::bytea;