Transition from SQLCipher 2.0 to 3.0 fails: error 26 (SQLITE_NOTADB) on attempt to read - sqlcipher

I've used SQLCipher for 2 years. Yesterday I upgraded to version 3.0.1 and compiled SQLCipher including arm64.
If I do a fresh install of my app, I can use the new cipher library without any problems.
But when I upgrade a previous version whose database was made with SQLCipher 2.0, I get error 26.
It seems the new library can't decrypt my DB.
I also tried compiling without arm64 support; the problem is the same.

I solved my problem by using
PRAGMA cipher_migrate
which migrates older database structures to the SQLCipher 3.0 format (Details).
It must be executed right after setting the key.
If you only want to read an old (1.x/2.x) DB with the new SQLCipher 3.0, use
PRAGMA kdf_iter = 4000
to restore the old kdf_iter value; the 3.0 default is 64,000 (Details).
In terms of the library, the SQLite DB connection looks as follows:
sqlite3 *database = NULL;
int errorCode = sqlite3_open_v2(path, &database, SQLITE_OPEN_READWRITE, NULL);
if (errorCode == SQLITE_OK) {
    errorCode = sqlite3_key(database, key, (int)strlen(key));
}
if (errorCode == SQLITE_OK) {
    /* Must run right after keying, before any other statement touches the DB. */
    errorCode = sqlite3_exec(database, "PRAGMA kdf_iter = 4000;", NULL, NULL, NULL);
}


IOCreatePlugInInterfaceForService failed w/ kIOReturnNoResources/0xe00002be
I am rewriting an old FireWire based command line utility into an XPC service. I need some help with an IOKit function.
The following code obtains an IOCFPlugInInterface for a FireWireAVCLibUnit (almost the same as the original code; the basic idea comes from the legacy SimpleAVC sample code).
When I call IOCreatePlugInInterfaceForService() in the XPC service, it always fails, returning 0xe00002be = kIOReturnNoResources in IOReturn.h.
I have confirmed there is no sandbox and no hardened runtime for the XPC service.
The original command line utility works perfectly on macOS 10.14, though. Would someone give me a hint on this topic?
CFDictionaryRef dict = CFDictionaryCreateCopy(kCFAllocatorDefault, self.dict);
kern_return_t result = IOServiceGetMatchingServices(kIOMasterPortDefault, dict, &serviceIterator);
if (result == KERN_SUCCESS && serviceIterator != IO_OBJECT_NULL) {
    while ((service = IOIteratorNext(serviceIterator)) != IO_OBJECT_NULL) {
        SInt32 score = 0;
        result = IOCreatePlugInInterfaceForService(service,
                                                   kIOFireWireAVCLibUnitTypeID,
                                                   kIOCFPlugInInterfaceID,
                                                   &interface,
                                                   &score);
        if (result != KERN_SUCCESS) continue;
        // result 0xe00002be = kIOReturnNoResources in IOReturn.h
        break;
    }
}
Additional details
I found IOCFPlugIn.c on opensource.apple.com. After basic verification:
- IOCreatePlugInInterfaceForService() fails inside IOCFPlugIn->Start().
(*iunknown)->QueryInterface(iunknown, CFUUIDGetUUIDBytes(interfaceType),
(LPVOID *)&interface);
<snip>
kr = (*interface)->Probe(interface, plist, service, &score);
<snip>
haveOne = (kIOReturnSuccess == (*interface)->Start(interface, plist, service));
Probe() returned kIOReturnSuccess, but Start() failed with kIOReturnNoDevice = 0xe00002c0, so haveOne = false.
Finally, IOCreatePlugInInterfaceForService() returned kIOReturnNoResources = 0xe00002be.
Is this related to some security feature on macOS?
MODIFIED
I have found that the hardened runtime's Camera access check rejected FireWireAVCLibUnit (tccd shows an error).
Even with sandbox and hardened runtime unchecked for the XPC service in Xcode, the XPC service is still handled via the sandbox (macOS 10.14.6 + Xcode 10.3).
I would appreciate any advice.
I have found the solution.
- Add NSCameraUsageDescription to Info.plist, and IOFireWireAVCUserClient will work.
- If sandboxed, the com.apple.security.device.firewire entitlement is also required.
Even if the App Sandbox capability is off, tccd verifies Info.plist. If "Privacy - Camera Usage Description" is not present, sandboxd rejects use of the IOFireWireAVCUserClient device.
See Information Property List Key Reference / Cocoa Keys.

WSS4J SHA1 value different when using IBM JDK 7 versus Oracle JDK 7

I need help understanding why I get a different SHA-1 digest value under IBM JDK 7 versus Oracle JDK 7 when using the WSS4J library.
What I am trying now is to force the use of the Sun JCE provider, having copied the Sun JARs into my IBM JDK. The reason is that the digest values come out right under Oracle JDK 7 and match the IRS web service's calculation of them.
I am using exclusive canonicalization (C14N_EXCL_WITH_COMMENTS) in my code, but I get a different SHA-1 hash for an XML element under IBM JDK 7 versus Oracle JDK 7. The XML I am hashing contains only constants; nothing in it changes, yet I get different hash values:
<urn:ACATransmitterManifestReqDtl
    xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"
    wsu:Id="aCATransmitterManifestReqDtl">
  <urn:PaymentYr>2015</urn:PaymentYr>
  <urn:PriorYearDataInd>0</urn:PriorYearDataInd>
  <urn:TransmissionTypeCd>O</urn:TransmissionTypeCd>
  <urn:TestFileCd>T</urn:TestFileCd>
</urn:ACATransmitterManifestReqDtl>
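To see why canonicalization matters here: the SHA-1 digest is computed over the exact canonical bytes, so any single-byte difference in the XML the two JDKs serialize (whitespace, attribute order, namespace handling) yields a completely different digest even though the element "looks" identical. A minimal sketch using the plain JDK MessageDigest (not WSS4J; the class and method names are mine):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class DigestDemo {
    // Hex-encoded SHA-1 of a string's UTF-8 bytes.
    static String sha1Hex(String s) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        byte[] digest = md.digest(s.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // The same element with and without a trailing newline hashes differently,
        // so the two JDKs must produce byte-identical canonical XML to agree.
        String a = "<urn:PaymentYr>2015</urn:PaymentYr>";
        String b = "<urn:PaymentYr>2015</urn:PaymentYr>\n";
        System.out.println(sha1Hex(a));
        System.out.println(sha1Hex(b));
    }
}
```

If the two JDKs' canonicalizers disagree on even one byte of the serialized element, this is exactly the kind of divergence you would observe.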
I am getting a different SHA-1 value with the following code:
WSSecSignature wsSecSignature = new WSSecSignature(config);
wsSecSignature.setX509Certificate(signingCert);
wsSecSignature.setUserInfo(alias, new String(keystorePassword.toCharArray()));
wsSecSignature.setUseSingleCertificate(true);
wsSecSignature.setKeyIdentifierType(WSConstants.X509_KEY_IDENTIFIER);
wsSecSignature.setDigestAlgo(WSConstants.SHA1);
wsSecSignature.setSignatureAlgorithm(WSConstants.RSA_SHA1);
wsSecSignature.setSigCanonicalization(WSConstants.C14N_EXCL_WITH_COMMENTS);
try {
    Document document = toDocument(message);
    WSSecHeader secHeader = new WSSecHeader();
    //secHeader.setMustUnderstand(true);
    secHeader.insertSecurityHeader(document);
    WSSecTimestamp timestamp = new WSSecTimestamp();
    timestamp.setTimeToLive(signatureValidityTime);
    document = timestamp.build(document, secHeader);
    List<WSEncryptionPart> wsEncryptionParts = new ArrayList<WSEncryptionPart>();
    WSEncryptionPart timestampPart = new WSEncryptionPart("Timestamp",
            WSConstants.WSU_NS, "");
    WSEncryptionPart aCATransmitterManifestReqDtlPart = new WSEncryptionPart(
            "ACATransmitterManifestReqDtl",
            "urn:us:gov:treasury:irs:ext:aca:air:7.0", "");
    WSEncryptionPart aCABusinessHeaderPart = new WSEncryptionPart(
            "ACABusinessHeader",
            "urn:us:gov:treasury:irs:msg:acabusinessheader", "");
    wsEncryptionParts.add(timestampPart);
    wsEncryptionParts.add(aCATransmitterManifestReqDtlPart);
    wsEncryptionParts.add(aCABusinessHeaderPart);
    wsSecSignature.setParts(wsEncryptionParts);
    Properties properties = new Properties();
    properties.setProperty("org.apache.ws.security.crypto.provider",
            "org.apache.ws.security.components.crypto.Merlin");
    Crypto crypto = CryptoFactory.getInstance(properties);
    KeyStore keystore = KeyStore.getInstance("JKS");
EDIT: WHAT I'VE TRIED. LOOKING FOR HELP FORCING THE PROVIDER TO BE SUN JCE
I was hoping to run Provider sunJCE = Security.getProvider("SunJCE"), but it comes back null.
I edited my java.security file to include com.sun.crypto.provider.SunJCE:
security.provider.1=com.ibm.jsse2.IBMJSSEProvider2
security.provider.2=com.ibm.crypto.provider.IBMJCE
security.provider.3=com.ibm.security.jgss.IBMJGSSProvider
security.provider.4=com.ibm.security.cert.IBMCertPath
security.provider.5=com.ibm.security.sasl.IBMSASL
security.provider.6=com.ibm.xml.crypto.IBMXMLCryptoProvider
security.provider.7=com.ibm.xml.enc.IBMXMLEncProvider
security.provider.8=com.ibm.security.jgss.mech.spnego.IBMSPNEGO
security.provider.9=sun.security.provider.Sun
security.provider.10=com.sun.crypto.provider.SunJCE
security.provider.11=sun.security.provider.Sun
security.provider.12=sun.security.rsa.SunRsaSign
security.provider.13=sun.security.jgss.SunProvider
And I am programmatically trying to use the SunJCE as follows:
Provider sunJCE = Security.getProvider("SunJCE");
if (sunJCE != null) {
    logger.info("SunJCE Java Cryptography Extension (JCE) to provide cryptographic, key and hash algorithms : IBMJCE will be removed");
    try {
        Security.removeProvider("IBMJCE");
        Security.insertProviderAt(sunJCE, 1);
    } catch (SecurityException se) {
        logger.info("Cannot move SunJCE to top priority", se);
    }
}
I also copied these JARs from Oracle JDK 7 to IBM JDK 7: (1) sunjce_provider.jar into WAS_HOME/java/jre/lib/ext and (2) jce.jar into WAS_HOME/java/jre/lib.
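Whether the copied provider actually registered can be checked programmatically before trying to promote it. A minimal sketch (plain JDK; the class and helper names are mine, not from the question) that lists the installed providers and moves SunJCE to top priority only when it is really present:

```java
import java.security.Provider;
import java.security.Security;

public class ProviderCheck {
    // Returns true if a provider with the given name is registered.
    static boolean hasProvider(String name) {
        return Security.getProvider(name) != null;
    }

    public static void main(String[] args) {
        // List what is actually registered: on an IBM JDK you would expect
        // IBMJCE entries here, on an Oracle/OpenJDK build you would expect SunJCE.
        for (Provider p : Security.getProviders()) {
            System.out.println(p.getName());
        }
        // Security.getProvider returns null when the provider class never
        // registered (e.g. the JAR is not on the extension path), which is why
        // the null check in the question's code matters.
        Provider sunJce = Security.getProvider("SunJCE");
        if (sunJce != null) {
            Security.removeProvider("SunJCE");
            Security.insertProviderAt(sunJce, 1);
        }
    }
}
```

If SunJCE does not appear in the listing on the IBM JDK, the JAR was not picked up from lib/ext and no amount of insertProviderAt calls will help.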

Open a .mdb database with a Visual Studio 6 program

I try to open a .mdb file using CDaoDatabase, but Open() throws an error: "Unrecognised database format". I first created the database in MS Access 2007 and saved the file as .mdb; then I installed MS Access 2003 and created the file again, but I get the same error. Does anyone have a clue what's happening?
CString pathDB = "SMACDB\\Transports.mdb";
CDaoDatabase dbTransp;
try
{
    dbTransp.Open(pathDB);
    CDaoRecordset rs(&dbTransp);
    COleVariant var1;
    rs.Open(dbOpenSnapshot, "SELECT * FROM Transporturi");
    while (!rs.IsEOF())
    {
        var1 = rs.GetFieldValue(1);
        CString val = (LPCTSTR)var1.bstrVal;
        g_carRestrict.pCarNmb.AddTail(val);
        var1 = rs.GetFieldValue(2);
        g_carRestrict.pAllowed.AddTail(var1.lVal);
        rs.MoveNext();
    }
    rs.Close();
    dbTransp.Close();
}
catch (CDaoException *pEx)
{
    pEx->Delete();
}
Visual C++ 6 uses DAO 3.5 by default, which does not support the Access 2000 or later file formats. To make MFC use DAO 3.6, change the module's runtime version number to 6.01 (set AfxGetModuleState()->m_dwVersion = 0x0601; before opening the database).
Suggested reading:
You receive an "Unrecognized database format" error message when you open a database created with Access 2000

Error while uploading image to database

When I try to upload an image using the code below, I get the following error: java.sql.SQLException: ORA-01460: unimplemented or unreasonable conversion requested
File image = new File("D:/" + fileName);
preparedStatement = connection.prepareStatement(query);
preparedStatement.setString(1, "Ayush");
fis = new FileInputStream(image);
preparedStatement.setBinaryStream(2, (InputStream) fis, (int) image.length());
int s = preparedStatement.executeUpdate();
if (s > 0) {
    System.out.println("Uploaded successfully!");
    flag = true;
} else {
    System.out.println("Failed to upload the image.");
    flag = false;
}
Please help me out.
DB Script :
CREATE TABLE ESTMT_SAVE_IMAGE
(
NAME VARCHAR2(50),
IMAGE BLOB
)
The first-listed cause of this error is an incompatible conversion, but after seeing your DB script I assume you are not doing any conversion in it.
There are other reported causes of ORA-01460 as well:
- Incompatible character sets can cause an ORA-01460.
- In SQL Developer, passing a bind variable value in excess of 4000 bytes as a string can result in an ORA-01460.
- With ODP, users moving from the 10.2 client and 10.2 ODP to the 11.1 client and 11.1.0.6.10 ODP reported an ORA-01460 error. This was a bug that should be fixed by patching ODP to the most recent version.
Please see this

Porting TurboPower Blowfish to .Net

I have an application that was originally written in Borland C++ and used a Blowfish algorithm implemented in the TurboPower LockBox component.
This application has now been ported to C#. Currently I call a Borland C++ DLL that uses this algorithm, but when running the application on a 64-bit OS I get errors whenever the DLL is used. If I compile the application as 32-bit, everything works, but we want this application to work as a 64-bit app. As far as I can tell, that means I need a .NET Blowfish implementation that behaves like the C++ one.
I found Blowfish.NET and it looks promising. However, when I use the same key and text, the encrypted results do not match. I did find out that the C++ DLL uses the BlowfishECB algorithm; it also converts the result to Base64, which I have done as well.
Any help with this would be appreciated. Here is some test code in C#:
//Convert the key to a byte array. In C++ the key was 16 bytes long.
byte[] _key = new byte[16];
Array.Clear(_key, 0, _key.Length);
var pwdBytes = System.Text.Encoding.Default.GetBytes(LicEncryptKey);
int max = Math.Min(16, pwdBytes.Length);
Array.Copy(pwdBytes, _key, max);

//Convert the string to a byte[] and pad it to the 8-byte block size.
var decrypted = System.Text.Encoding.ASCII.GetBytes(originalString);
var blowfish = new BlowfishECB();
blowfish.Initialize(_key, 0, _key.Length);
int arraySize = decrypted.Length;
int diff = arraySize % BlowfishECB.BLOCK_SIZE;
if (diff != 0)
{
    arraySize += (BlowfishECB.BLOCK_SIZE - diff);
}
var decryptedBytes = new Byte[arraySize];
Array.Clear(decryptedBytes, 0, decryptedBytes.Length);
Array.Copy(decrypted, decryptedBytes, decrypted.Length);

//Prepare the byte array for the encrypted result.
var encryptedBytes = new Byte[decryptedBytes.Length];
Array.Clear(encryptedBytes, 0, encryptedBytes.Length);
blowfish.Encrypt(decryptedBytes, 0, encryptedBytes, 0, decryptedBytes.Length);

//Convert to Base64.
string result = Convert.ToBase64String(encryptedBytes);
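As a cross-check independent of Blowfish.NET, the JDK's own javax.crypto provider also implements Blowfish in ECB mode. A minimal sketch (the helper names and the 16-byte zero-filled test key are my own assumptions, mirroring the zero-padding and Base64 steps of the C# code above; this is not the LockBox implementation):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class BlowfishCheck {
    static final int BLOCK_SIZE = 8;

    // Zero-pad up to a multiple of the 8-byte Blowfish block, as the C# code does.
    static byte[] zeroPad(byte[] data) {
        int padded = ((data.length + BLOCK_SIZE - 1) / BLOCK_SIZE) * BLOCK_SIZE;
        return Arrays.copyOf(data, padded);
    }

    // Blowfish/ECB over the zero-padded plaintext, Base64-encoded.
    static String encryptBase64(byte[] key, byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("Blowfish/ECB/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "Blowfish"));
        return Base64.getEncoder().encodeToString(cipher.doFinal(zeroPad(plaintext)));
    }

    public static void main(String[] args) throws Exception {
        // 16-byte key, zero-filled past the password bytes, matching the C# key setup.
        byte[] key = Arrays.copyOf("testkey".getBytes(StandardCharsets.US_ASCII), 16);
        System.out.println(encryptBase64(key, "hello".getBytes(StandardCharsets.US_ASCII)));
    }
}
```

If this output matches the C++ DLL's output for the same key and plaintext, the mismatch lies in how Blowfish.NET is being used (key layout or padding) rather than in the algorithm itself.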
It won't be compatible with your TurboPower LockBox data.
I'd suggest providing a utility that performs a one-time data migration: decode with LockBox in C++ (32-bit), write the results to temp files/tables, and re-encode with Blowfish.NET in C# (64-bit).
This migration runs once, before any upgrade to the .NET version; afterwards everything is compatible with it.
Since you're changing the format anyway, you could also omit the Base64 conversion and store binary files/BLOBs. Other ideas may be useful too, such as applying multiple encryptions or replacing Blowfish with something else.
