I'm trying to drop a few tables at once on a ClickHouse cluster. For example:
DROP TABLE IF EXISTS default.log_null, default.null_view
I get the error:
Code: 62, e.displayText() = DB::Exception: Syntax error: failed at position 39 (','): , default.null_view. Expected one of: NO DELAY, end of query, INTO OUTFILE, SETTINGS, ON, FORMAT, SYNC (version 21.3.13.9 (official build))
What is the correct syntax, if such an action is supported?
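Based on the error, this ClickHouse version accepts only a single table per DROP statement, so a minimal workaround sketch is to issue one DROP per table. The ON CLUSTER clause and the cluster name my_cluster below are assumptions, only needed if the tables must be dropped on every node of the cluster:

-- One DROP per table; 'my_cluster' is a placeholder cluster name,
-- and ON CLUSTER can be omitted when dropping on a single node.
DROP TABLE IF EXISTS default.log_null ON CLUSTER my_cluster;
DROP TABLE IF EXISTS default.null_view ON CLUSTER my_cluster;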
In my data, the Address column contains this information: (Tamilnadu, HNo: 90, India,Ramasamy). While loading data from Oracle to Snowflake, we are getting the error below.
Unable to copy files into table.
Numeric value '1,"Tamilnadu, HNo: 90, India,Ramasamy"' is not recognized
File '#SAMPLE_TEST/ui1675860748054/test1.csv', line 1, character 1
Row 1, column "SAMPLE_TEST"["ID":1]
If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client
NOTE: We have two columns in our table (ID, Address)
Kindly help me resolve the issue.
Thank you.
The data should be copied to Snowflake in the following format: "Tamilnadu, HNo: 90, India,Ramasamy".
Currently loading as --- "Tamilnadu
Expected as --- "Tamilnadu, HNo: 90, India,Ramasamy"
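The error suggests the file is being split on every comma, so the first fragment of the quoted address ends up in the numeric ID column. A minimal sketch, assuming the file is a standard CSV with double-quoted fields; the stage name my_stage and format name my_csv_format are placeholders:

-- Declare that fields may be enclosed in double quotes, so commas inside
-- the quotes are not treated as column delimiters.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

COPY INTO SAMPLE_TEST (ID, ADDRESS)
FROM @my_stage/test1.csv
FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');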
I have a table with a column action LowCardinality(String), and I want to change this column to action Nullable(String), but I am getting this error:
Code: 473, e.displayText() = DB::Exception: READ locking attempt on
"glassbox.beacon_event" has timed out! (120000ms) Possible deadlock
avoided. Client should retry.: While executing Columns (version
20.4.2.9 (official build))
Also, the client (Tabix) gets stuck.
If I run it as two separate commands, it works:
alter table test modify column action String
alter table test modify column action Nullable(String)
Why can't I run it as one command?
alter table test modify column action Nullable(String)
It's probably a bug. Try ClickHouse version 20.6.
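If upgrading is not an immediate option, the two-step workaround from the question applied to the table in the error message would look like the sketch below (assuming glassbox.beacon_event is the affected table):

-- Step 1: drop LowCardinality first; step 2: make the plain String nullable.
ALTER TABLE glassbox.beacon_event MODIFY COLUMN action String;
ALTER TABLE glassbox.beacon_event MODIFY COLUMN action Nullable(String);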
DB-240000 ODBC error: [Hortonworks][Hardy] (80) Syntax or semantic
analysis error thrown in server while executing query.
Error message from server: Error while compiling statement: FAILED:
ParseException line 1:12 cannot recognize input near 'ALL_ROWS'
'*' '/' in hint name SQLState: 37000
Sample query
WDB-200001 SQL statement 'SELECT /*+ ALL_ROWS */ A.test FROM table A' could not be executed.
The syntax looks right per the Oracle documentation (https://docs.oracle.com/cd/E11882_01/server.112/e41084/sql_elements006.htm#SQLRF51108).
Or is there a missing parameter in the ODBC configuration?
https://hortonworks.com/wp-content/uploads/2015/10/Hortonworks-Hive-ODBC-Driver-User-Guide.pdf
Use Native Query
Key Name: UseNativeQuery
Default Value: Clear (0)
Required: No
Description: When this option is enabled (1), the driver does not transform the queries emitted by an application, so the native query is used. When this option is disabled (0), the driver transforms the queries emitted by an application and converts them into an equivalent form in HiveQL. Note: If the application is Hive-aware and already emits Hive...
Could this be an issue with HDP versioning?
Is there a missing parameter in the ODBC connection string?
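One hedged reading of the error: the statement is reaching Hive, and Hive's parser simply does not know ALL_ROWS as a hint name (it is an Oracle optimizer hint; Hive only recognizes its own hints such as MAPJOIN or STREAMTABLE), so no ODBC setting will make it parse. Setting UseNativeQuery=1 in the DSN only stops the driver from rewriting the query; it does not make Hive accept Oracle hints. A sketch of the statement with the hint removed (identifiers kept from the sample; table is backquoted because it is a reserved word in HiveQL):

-- Same sample query without the Oracle-only /*+ ALL_ROWS */ hint.
SELECT A.test FROM `table` A;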
I have been trying out remote tables in MonetDB. I have set up a remote table mapped to an existing table on a MonetDB instance running on another host.
I am able to perform some basic queries; however, I am unable to use the WHERE clause. If I try to execute a simple query like:
select * from "T1" where product_id > 1757;
The query execution fails with the following error (of course, the query runs fine on the local table):
TypeException:user.l4[6]:'algebra.thetasubselect' undefined in: algebra.thetasubselect(X_40:bat[:int],X_41:any,">":str);
SQLState: 22000
ErrorCode: 0
Error: (mapi:monetdb://monetdb#192.168.1.46/visokio) 'algebra.thetasubselect' undefined in: algebra.thetasubselect(X_46:bat[:int],X_47:any,">":str);
SQLState: 22000
ErrorCode: 0
I receive similar errors on every query containing a where clause.
Am I doing something wrong? Is there a reason why I cannot execute a query containing a where clause?
Thank you for your help.
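For reference, a sketch of the kind of setup being described; the column list and the port are assumptions, and the remote definition has to match the schema of the table on the other host exactly. One commonly reported cause of "'algebra.thetasubselect' undefined" is a version mismatch between the two MonetDB servers, since the plan generated locally is executed on the remote side, so checking that both hosts run the same MonetDB release is a reasonable first step.

-- Assumed column list and default port; adjust to the real remote schema.
CREATE REMOTE TABLE "T1" (
    product_id   INT,
    product_name VARCHAR(100)
) ON 'mapi:monetdb://192.168.1.46:50000/visokio';

-- The failing query from above:
SELECT * FROM "T1" WHERE product_id > 1757;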
The following query is giving me the error:
Execute error: Error while processing statement: FAILED: Execution Error, return code 2 from
org.apache.hadoop.hive.ql.exec.mr.MapRedTask
Does anyone know why or how to resolve this issue?
proc sql;
connect to hadoop(server='xxx' port=10000 schema=xxx SUBPROTOCOL=hive2 sql_functions=all);
execute(
create table a as
select
*,
lag(claim_flg,1) over (order by ptnt_id,month) as lag1
from b
) by hadoop;
disconnect from hadoop;
quit;
It appears to be a limitation in the Hive database:
Hive Limit of 127 Expressions per Table
Due to a limitation in the Hive database, tables can contain a maximum of 127 expressions. When the 128th expression is read, the directive fails and the SAS log receives a message similar to the following:
ERROR: java.sql.SQLException: Error while processing statement: FAILED:
Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
ERROR: Unable to execute Hadoop query.
ERROR: Execute error.
SQL_IP_TRACE: None of the SQL was directly passed to the DBMS.
The Hive limitation applies anytime a table is read as part of a directive. For SAS Data Loader, the error can occur in aggregations, profiles, when viewing results, and when viewing sample data.
Source: http://support.sas.com/documentation/cdl/en/dmddug/67908/HTML/default/viewer.htm#p1fl149uastoudn1v7r2u5ff8aft.htm
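If the pass-through statement really is hitting that expression limit, a hedged workaround is to replace the wide select * with an explicit list of just the columns the downstream step needs. The statement below would go inside the same execute( ... ) by hadoop block; only the columns already named in the sample are listed, and any further columns would have to be added as required.

-- Listing only the needed columns keeps the expression count low.
create table a as
select
    ptnt_id,
    month,
    claim_flg,
    lag(claim_flg, 1) over (order by ptnt_id, month) as lag1
from b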