I followed this article, store fast report template, to implement it, but I can't save and load the report because the stream comes back empty!
I finally figured out the problem: it is related to the space SQL Server actually makes available for the varbinary field. My FastReport template was byte[616054], but only 50 bytes were being saved in SQL Server; check this link: varbinarymax.
So I changed the data type to ntext, converted the stream to a string for storing, and did the reverse for the retrieval mechanism.
This command converts the bytes to a string:
byte[] blob = stream.ToArray();
string st = System.Text.Encoding.UTF8.GetString(blob);
and this one retrieves it:
byte[] blob2 = Encoding.UTF8.GetBytes(this.TableAdapter.GetDataByID(key).Rows[0]["Report"].ToString());
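A caveat on this approach: decoding arbitrary bytes as UTF-8 is lossy, because invalid byte sequences get replaced with a substitution character, so a template that is not valid UTF-8 text can be silently corrupted on the round trip. Base64 is a lossless text representation of binary data. A small sketch (in Java rather than C#, purely to illustrate the difference) showing UTF-8 losing bytes while Base64 round-trips exactly:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Base64;

public class BinaryAsText {
    public static void main(String[] args) {
        // Arbitrary binary data; 0xF9 and a lone 0x82 are not valid UTF-8
        byte[] blob = {(byte) 0xF9, (byte) 0x82, 0x75, 0x01};

        // UTF-8 round trip: invalid sequences are replaced, so bytes are lost
        String viaUtf8 = new String(blob, StandardCharsets.UTF_8);
        byte[] backUtf8 = viaUtf8.getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.equals(blob, backUtf8)); // false

        // Base64 round trip: always byte-for-byte identical
        String viaB64 = Base64.getEncoder().encodeToString(blob);
        byte[] backB64 = Base64.getDecoder().decode(viaB64);
        System.out.println(Arrays.equals(blob, backB64)); // true
    }
}
```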
You can also keep varbinary(max); check this out:
Enable File Stream in SQL
I have an ExecuteSQL processor that returns a SQL Server varbinary field for a particular row:
select [File]
from dbo.Attachment
where attachmentid=?
The query will find one row. The content gets stored in Avro. The retrieved File could be a text format (CSV, HTML, etc) or a binary format (PDF, Office docs, images, etc).
If the content is text, I can run it through ConvertAvroToJSON and then EvaluateJsonPath to get the content that I want. That doesn't work with the binary content, however. When I download the content of a flowfile that has, say, a PowerPoint file, PowerPoint complains about the content.
I'd like to have the Content of my FlowFile be just the binary content (I'll be sending it on to a PutMarkLogic processor later). How can I do that?
I did not test it, but you could use ExecuteGroovyScript as a workaround to write the binary field directly to the flow file content.
SQL.mydb: add this parameter at the processor level and link it to the required DBCP pool.
AttributeWithID: I assume there is a flow file attribute with this name that contains the value to be used in the SQL query for attachmentid.
def ff = session.get()
if (!ff) return

SQL.mydb.eachRow("""
    select [File]
    from dbo.Attachment
    where attachmentid = ${ff.AttributeWithID}
""") { row ->
    ff.write { outStream ->
        outStream << row.getBinaryStream(1)
    }
}
REL_SUCCESS << ff
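The heart of the script is the byte-for-byte stream copy: Groovy's << operator pipes the JDBC binary stream straight into the flow file's content with no text decoding in between, which is what keeps PDFs and Office files intact. The same copy written as a self-contained plain-Java sketch (the stream names are stand-ins, not NiFi APIs):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copies every byte from the source (e.g. ResultSet#getBinaryStream)
    // into the sink (e.g. the flow file's content stream), unmodified.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] pdfBytes = {0x25, 0x50, 0x44, 0x46}; // "%PDF" header as a stand-in
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(pdfBytes), sink);
        System.out.println(copied); // 4
    }
}
```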
I need to encrypt data saved onto my DB. I am currently using spring and hibernate to save data.
I have looked at some materials and tried to implement the code, but it has resulted in various generic errors, and some of the material was not targeted at MySQL.
Here's the code that has gotten me furthest:
@Column(name = "disability_description")
@Length(max = 500)
@ColumnTransformer(
    read = "AES_DECRYPT(disability_description, 'mykey')",
    write = "AES_ENCRYPT(?, 'mykey')"
)
private String disabilityDescription;
This, however, doesn't work, as I get the following errors:
org.hibernate.exception.GenericJDBCException: could not execute statement
java.sql.SQLException: Incorrect string value: '\xF9\x82u\x01\x99\x1A...' for column 'disability_description' at row 1
Please point me in the right direction; I am lost. Also, mykey doesn't point to anything; I just entered a random word.
I suspect that your column is not of a binary type. From the MySQL docs:
AES_ENCRYPT() encrypts the string str using the key string key_str and
returns a binary string containing the encrypted output.
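Since AES_ENCRYPT returns binary, changing the column to a binary type (e.g. VARBINARY(512) or BLOB) should resolve the "Incorrect string value" error. For intuition, here is a plain-Java sketch of the scheme MySQL is widely described as using by default (AES-128 in ECB mode, with the key folded to 16 bytes by XOR); this is an assumption to verify against your server's block_encryption_mode setting, and the class and method names are made up. It shows that the ciphertext really is raw bytes, not text:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class MySqlAesSketch {
    // Fold an arbitrary-length key string into 16 bytes by XOR-ing it in,
    // mirroring how MySQL's AES_ENCRYPT is commonly described to derive its key.
    static SecretKeySpec foldKey(String key) {
        byte[] folded = new byte[16];
        byte[] raw = key.getBytes(StandardCharsets.UTF_8);
        for (int i = 0; i < raw.length; i++) {
            folded[i % 16] ^= raw[i];
        }
        return new SecretKeySpec(folded, "AES");
    }

    static byte[] encrypt(String plaintext, String key) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, foldKey(key));
        return c.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
    }

    static String decrypt(byte[] ciphertext, String key) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, foldKey(key));
        return new String(c.doFinal(ciphertext), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        byte[] ct = encrypt("some disability description", "mykey");
        // ct is raw binary: it belongs in a VARBINARY/BLOB column, not VARCHAR
        System.out.println(decrypt(ct, "mykey"));
    }
}
```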
I read that in order to populate binary values for an INSERT query you need to create a PreparedStatement and then use the setBytes() API to set the byte array as the binary parameter.
My problem is that when I do this I get "data exception: String data, right truncation".
I read that this can happen if you populate a value larger than the declared size, but here I am using a very small byte[] ("s".getBytes()).
I also tried setBinaryStream(), but with the same result!
I also tried setting a null value. Still I get the same error.
The length of the VARBINARY or LONGVARBINARY column must be enough to accept the data you are inserting. Your CREATE TABLE statement can declare the column as VARBINARY, allowing up to 16MB for each data item.
If you use BINARY as the type, it means only one byte is allowed.
I know how to store and retrieve data using isolated storage. My problem is how to sort the data I want to recover from what has already been stored. All my data is stored in a single file.
The user stores data every day, and maybe on a particular date he makes two entries and none on some other day. In that case, how should I search for the particular day's info I need?
Could you also explain how data is stored in isolated storage: is it in packets of data or some other way? If I store two data sets in the same file, does it automatically shift to the next line for the second one, or do I have to specify that?
Moreover, if I want to save data on the same line, does it automatically separate the data sets with a tab or some character in between, or does the developer have to take care of this?
If the date is saved in the name of the file, then you can use the following code to search for it (note that GetFileNames returns a string array):
var appStorage = IsolatedStorageFile.GetUserStoreForApplication();
string[] files = appStorage.GetFileNames("the date you are looking for");
In answer to your other question: if you use .Write() when writing text to a file, it creates one long stream of data; if you use .WriteLine(), the information is written and a new-line reference is added at the end.
I'm not sure, as you weren't very clear, but just in case, here is a general procedure for reading files from IsolatedStorage:
var appStorage = IsolatedStorageFile.GetUserStoreForApplication();
using (StreamReader reader = new StreamReader(appStorage.OpenFile(fileName, FileMode.Open, FileAccess.Read)))
{
fileContent = reader.ReadToEnd();
}
How can I view BLOB data? Can I export it to a text file? I am using Oracle SQL Developer 5.1. When I tried
select utl_raw.cast_to_varchar2(dbms_lob.substr(COLNAME))
from user_settings where <fieldname>=...
it returned the following error: ORA-06502: PL/SQL: numeric or value error: raw variable length too long.
The BLOB contains text in XML format.
To view xml data stored as a BLOB, do the following;
Open the table view, with the Data tab selected
Double click on the column field value, and a pencil button should appear in the field. Click the pencil button.
The Edit Value window should open; click the 'View As: Text' checkbox. From this window you can also save out any particular file data you require.
PS: I'm running Oracle SQL Developer version 3.1.05
That's because it exceeds the size of the display field; you need to set the size. Pass 1500 as the amount for substr and it should work:
select utl_raw.cast_to_varchar2(dbms_lob.substr(colname,1500))
from user_settings where <row_id>=...
BLOB data is typically just... a binary blob of data.
Sure, you can export it to a text file by converting it to some kind of text representation... But what if it is an image?
jaganath: You need to sit down and figure out what it is you're dealing with, and then find out what it is you need to do.
You could look at DBMS_LOB.CONVERTTOCLOB
But if it is XML, why store it in a BLOB rather than an XMLType (or a CLOB)?
From the error message it seems that the blob length is too long to fit into a varchar. You could do the conversion in your application code and write the XML into a String or file.
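A sketch of that application-side conversion, assuming the BLOB bytes are UTF-8 encoded XML. The class and helper names here are made up, and the ByteArrayInputStream is a stand-in so the example is self-contained; in real code the InputStream would come from java.sql.Blob#getBinaryStream():

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class BlobToXml {
    // Read an InputStream fully into memory and decode it as UTF-8 text.
    static String readAsUtf8(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        return new String(buf.toByteArray(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for blob.getBinaryStream():
        InputStream fake = new ByteArrayInputStream(
            "<settings><theme>dark</theme></settings>"
                .getBytes(StandardCharsets.UTF_8));
        System.out.println(readAsUtf8(fake));
    }
}
```

From there the String can be written to a file or handed to an XML parser, sidestepping the varchar2 length limit entirely.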