Tuesday, 25 November 2014

[SOLVED] FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations in hive-0.14.0


CRUD operations are supported in Hive from 0.14 onwards.
See the Hive wiki for details.

Hive is data warehouse software that facilitates querying and managing large datasets residing in distributed storage. In a data warehouse there are situations where we need to update or delete records. In earlier versions of Hive, UPDATE was not supported, but there were workarounds to update a record:

1. Update Statement In Hive For Small Tables
2. Update Statement In Hive For Large Tables using INSERT


Let's see how to do INSERT, UPDATE and DELETE in the newer versions of Hive.

Create a table "test"
CREATE EXTERNAL TABLE 
    test (EmployeeID Int,FirstName String,Designation  
        String,Salary Int,Department String) 
    ROW FORMAT DELIMITED FIELDS TERMINATED BY  "," 
    LOCATION '/user/hdfs/Hive';
We will try to update the salary of employee id 19 from 45,000 to 50,000.
 hive> UPDATE test 
           SET salary = 50000 
           WHERE employeeid = 19;

 FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

Running the above query throws a SemanticException. In order to allow UPDATE and DELETE, we need to add additional settings in hive-site.xml and create the table with ACID output format support.

To achieve this, follow the steps below:

1. New Configuration Parameters for Transactions
 hive.support.concurrency – true
 hive.enforce.bucketing – true
 hive.exec.dynamic.partition.mode – nonstrict
 hive.txn.manager – org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
 hive.compactor.initiator.on – true
 hive.compactor.worker.threads – 1
You can set these configurations permanently in hive-site.xml or per session via the terminal.
Don't forget to restart Hive once the settings are applied in hive-site.xml, or you will get the same error again.
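For reference, the same parameters written as hive-site.xml entries might look like the sketch below. Only the standard <property>/<name>/<value> wrapping is added here; the names and values are exactly the ones listed above.

 <property>
   <name>hive.support.concurrency</name>
   <value>true</value>
 </property>
 <property>
   <name>hive.enforce.bucketing</name>
   <value>true</value>
 </property>
 <property>
   <name>hive.exec.dynamic.partition.mode</name>
   <value>nonstrict</value>
 </property>
 <property>
   <name>hive.txn.manager</name>
   <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
 </property>
 <property>
   <name>hive.compactor.initiator.on</name>
   <value>true</value>
 </property>
 <property>
   <name>hive.compactor.worker.threads</name>
   <value>1</value>
 </property>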
2. Below query creates the HiveTest table with ACID support
(to do UPDATE, DELETE or INSERT we need to create a table that supports ACID properties).
 create table HiveTest 
   (EmployeeID Int,FirstName String,Designation String,
     Salary Int,Department String) 
   clustered by (department) into 3 buckets 
   stored as orc TBLPROPERTIES ('transactional'='true') ;
3. Load data into HiveTest from a staging table, which contains the original data.
 from stagingtbl 
   insert into table HiveTest 
   select employeeid,firstname,designation,salary,department;
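
(Here stagingtbl is just a plain, non-ACID table holding the original data — the same kind of delimited external table as the test table above. A minimal sketch of what it could look like; the HDFS path is illustrative only.)

 -- illustrative staging table with the same columns as HiveTest
 CREATE EXTERNAL TABLE stagingtbl 
     (EmployeeID Int,FirstName String,Designation String,
      Salary Int,Department String) 
     ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' 
     LOCATION '/user/hdfs/HiveStaging';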

4. UPDATE, DELETE and INSERT operations


1. UPDATE
 update HiveTest 
    set salary = 50000 
    where employeeid = 19; 

SYNOPSIS

  1. The referenced column must be a column of the table being updated.
  2. The value assigned must be an expression that Hive supports in the select clause.  Thus arithmetic operators, UDFs, casts, literals, etc. are supported.  Subqueries are not supported.
  3. Only rows that match the WHERE clause will be updated.
  4. Partitioning columns cannot be updated.
  5. Bucketing columns cannot be updated.
  6. In Hive 0.14, upon successful completion of this operation the changes will be auto-committed.
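
For example, since arithmetic expressions and casts are allowed on the right-hand side (point 2 above), a department-wide raise could be written as below; the 10% figure is only for illustration.

 -- illustrative: give everyone in department 'B' a 10% raise
 update HiveTest 
    set salary = cast(salary * 1.1 as int) 
    where department = 'B';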


2. INSERT
 insert into table HiveTest 
     values(21,'Hive','Hive',0,'B');

SYNOPSIS

  1. Each row listed in the VALUES clause is inserted into table tablename.
  2. Values must be provided for every column in the table.  The standard SQL syntax that allows the user to insert values into only some columns is not yet supported.  To mimic the standard SQL, nulls can be provided for columns the user does not wish to assign a value to.
  3. Dynamic partitioning is supported in the same way as for INSERT...SELECT.
  4. If the table being inserted into supports ACID and a transaction manager that supports ACID is in use, this operation will be auto-committed upon successful completion.
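
For instance, to mimic inserting into only some columns (point 2 above), NULL can be supplied for the remaining ones; the rows below are purely illustrative.

 -- illustrative: null is supplied for columns we do not want to set
 insert into table HiveTest 
     values (22,'John',null,null,'A'),
            (23,'Jane','Analyst',60000,'C');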



3. DELETE
 delete from HiveTest
     where employeeid=19;

SYNOPSIS
  1. Only rows that match the WHERE clause will be deleted.
  2. In Hive 0.14, upon successful completion of this operation the changes will be auto-committed.
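
As a simple illustration of point 1, a DELETE with a broader WHERE clause removes every matching row, which can then be verified with a SELECT:

 -- illustrative: delete all rows of one department, then verify
 delete from HiveTest 
     where department = 'B';
 select count(*) from HiveTest 
     where department = 'B';   -- should now return 0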

270 comments:

  1. Hi, I am using Hive 0.14 and followed all the steps as mentioned above, and I have set the new configuration parameters as you said. But I am still getting the same error:

    SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

    Please advise.

    ReplyDelete
    Replies
    1. Can you show the configurations you set? And where did you set the parameters?

      Delete
    2. Hi, I am getting the same error. I have set the parameters in the hive shell itself.

      Delete
    3. Hi,

      Can you please post what steps you followed?

      Delete
    4. Hi Sameekshya,

      You need to set those new properties in hive-site.xml, restart the Hive server, then create the tables and check update, insert and delete.

      Delete

  2. hive.support.concurrency
    true

    Whether Hive supports concurrency control or not.
    A ZooKeeper instance must be up and running when using zookeeper Hive lock manager



    hive.enforce.bucketing
    true
    Whether bucketing is enforced. If true, while inserting into the table, bucketing is enforced.


    hive.exec.dynamic.partition.mode
    nonstrict

    In strict mode, the user must specify at least one static partition
    in case the user accidentally overwrites all partitions.
    In nonstrict mode all partitions are allowed to be dynamic.



    hive.txn.manager
    org.apache.hadoop.hive.ql.lockmgr.DbTxnManager

    Set to org.apache.hadoop.hive.ql.lockmgr.DbTxnManager as part of turning on Hive
    transactions, which also requires appropriate settings for hive.compactor.initiator.on,
    hive.compactor.worker.threads, hive.support.concurrency (true), hive.enforce.bucketing
    (true), and hive.exec.dynamic.partition.mode (nonstrict).
    The default DummyTxnManager replicates pre-Hive-0.13 behavior and provides
    no transactions.



    hive.compactor.initiator.on
    true

    Whether to run the initiator and cleaner threads on this metastore instance or not.
    Set this to true on one instance of the Thrift metastore service as part of turning
    on Hive transactions. For a complete list of parameters required for turning on
    transactions, see hive.txn.manager.



    hive.compactor.worker.threads
    1

    How many compactor worker threads to run on this metastore instance. Set this to a
    positive number on one or more instances of the Thrift metastore service as part of
    turning on Hive transactions. For a complete list of parameters required for turning
    on transactions, see hive.txn.manager.
    Worker threads spawn MapReduce jobs to do compactions. They do not do the compactions
    themselves. Increasing the number of worker threads will decrease the time it takes
    tables or partitions to be compacted once they are determined to need compaction.
    It will also increase the background load on the Hadoop cluster as more MapReduce jobs
    will be running in the background.


    These are set in hive-site.xml. Can you please share your contact number? srinivas.thunga@gmail.com

    ReplyDelete
    Replies
    1. For a small clarification, can you set it directly in the terminal instead of hive-site.xml and try?

      Delete
    2. hive> set hive.support.concurrency=true;
      hive> set hive.enforce.bucketing=true;
      hive> set hive.exec.dynamic.partition.mode=nonstrict;
      hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
      hive> set hive.compactor.initiator.on=true;
      hive> set hive.compactor.worker.threads=1;

      Delete
  3. By directly, can you let me know the command for setting them directly on the terminal please?

    ReplyDelete
  4. yes, i did the same from hive console,

    But still i am not able to update and delete.

    getting the same error.

    Pls help

    ReplyDelete
    Replies
    1. Is that the same error or something like FAILED: SemanticException [Error 10297]: Attempt to do update or delete on table default.test that does not use an AcidOutputFormat or is not bucketed

      Delete
    2. yes same error

      SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations

      Delete
  5. yes same error like

    SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

    Only single record insert is working fine.
    Not able delete and update

    ReplyDelete
    Replies
    1. After setting the parameters, did you create a table with the ACID property?

      Delete
    2. It is mentioned in this blog itself:
      1. New Configuration Parameters for Transactions
      2. Create a Hive table with ACID support
      3. Load data into the Hive table
      4. Do UPDATE, DELETE and INSERT

      Delete
    3. Hi Sreeveni,

      Thanks a lot for your support.

      Its working fine.

      Able to insert, update and delete records.

      Delete
    4. Hello sreevani and vasu,
      Even i have made all the configuration settings,but still getting the same error as:
      "FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations"
      and am setting them in the terminal,can u please help me in resolving this error?

      Thank you.

      Delete
    5. Hi Tejashwini,

      Can you post in detail the steps you did till now? And which version of Hive are you using?

      Delete
    6. Hello sreevani and vasu,
      Even i have made all the configuration settings,but still getting the same error as:
      "FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations"
      and am setting them in the terminal,can u please help me in resolving this error?

      Thank you.

      Delete
    7. Instead of setting them directly in the terminal, can you please do the same in hive-site.xml and restart the Hive server?

      Delete
    8. can you please show your create statement.

      Delete
  6. Hi Sreevani,

    I am also getting the similar issue for update

    Error is :
    FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

    I have already setted parmaters as in this blog

    After that tried create , Insert and update but still not worked.


    ReplyDelete
    Replies
    1. After setting the parameters, have you restarted hive server and Metastore?

      Delete
    2. Yes you are right srinivas. Please restart Hive.

      Delete
    3. But sreeveni, can we insert with selected columns like

      insert into table A id,name select id,name from B

      Delete
  7. Hi, I am new to hive
    How to restart hive and Metastore ?
    can you please help me to do this.
    Thanks,
    Nalin

    ReplyDelete
    Replies
    1. Sure, go to where Hive was extracted.

      $ service --status-all

      Check that hive-server2 and the metastore are running.

      $ sudo service hive-server2 restart

      Delete
  8. I am setting them at the session level.
    In case I need to change configurations in Hive, hive-site.xml is not available in hive-1.0.0, right? How do I do that?

    ReplyDelete
    Replies
    1. You can set it in the terminal as well.

      hive> set hive.support.concurrency=true;
      hive> set hive.enforce.bucketing=true;
      hive> set hive.exec.dynamic.partition.mode=nonstrict;
      hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
      hive> set hive.compactor.initiator.on=true;
      hive> set hive.compactor.worker.threads=1;

      Delete
    2. I did that at session level also but still i am not able to update

      Delete
    3. please find below what I am doing

      hive> set hive.support.concurrency=true;
      hive> set hive.enforce.bucketing=true;
      hive> set hive.exec.dynamic.partition.mode=nonstrict;
      hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
      hive> set hive.compactor.initiator.on=true;
      hive> set hive.compactor.worker.threads=1;
      hive> create table HiveTest3(EmployeeID Int,FirstName String,Designation String ,Salary Int,Department String) clustered by (department) into 3 buckets stored as orc TBLPROPERTIES ('transactional'='true') ;
      OK
      Time taken: 0.519 seconds
      hive> Insert into table HiveTest3 select * from HiveTest1;
      Query ID = nalin_20150220010303_6feec16c-c945-4abc-98c9-fb5c13e9fcda
      Total jobs = 1
      Launching Job 1 out of 1
      Number of reduce tasks is set to 0 since there's no reduce operator
      Starting Job = job_1424286757124_0011, Tracking URL = http://bgengmst.ibm.com:8088/proxy/application_1424286757124_0011/
      Kill Command = /opt/hadoop/hadoop-2.3.0/bin/hadoop job -kill job_1424286757124_0011
      Hadoop job information for Stage-1: number of mappers: 2; number of reducers: 0
      2015-02-20 01:03:35,704 Stage-1 map = 0%, reduce = 0%
      2015-02-20 01:03:52,221 Stage-1 map = 50%, reduce = 0%, Cumulative CPU 1.26 sec
      2015-02-20 01:03:58,400 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 2.51 sec
      MapReduce Total cumulative CPU time: 2 seconds 510 msec
      Ended Job = job_1424286757124_0011
      Loading data to table dbtest.hivetest3
      Table dbtest.hivetest3 stats: [numFiles=1, numRows=1, totalSize=502, rawDataSize=268]
      MapReduce Jobs Launched:
      Stage-Stage-1: Map: 2 Cumulative CPU: 2.51 sec HDFS Read: 1165 HDFS Write: 664 SUCCESS
      Total MapReduce CPU Time Spent: 2 seconds 510 msec
      OK
      Time taken: 31.73 seconds
      hive> select * from HiveTest3;
      OK
      1 test mgr 1000 b
      Time taken: 0.13 seconds, Fetched: 1 row(s)
      hive> update HiveTest3 set salary = 2000 where EmployeeId = 1;
      FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.
      hive>

      Delete
    4. Can you please set the same thing in hive-site.xml and restart the server?

      Delete
  9. Set it in the hive-site.xml file and restart the server.

    ReplyDelete
    Replies
    1. I don't see any hive-site.xml in the conf directory. If it is not available, how do I create hive-site.xml?

      Delete
    2. If possible, can you please paste the conf directory contents?

      Delete
    3. [nalin@bgengmst conf]$ ls
      beeline-log4j.properties.template hive-env.sh.template
      configuration.xsl hive-exec-log4j.properties.template
      hive-default.xml.template hive-log4j.properties.template
      hive-env.sh hive-site.xml-snarveso

      Delete
    4. Check the hive-default.xml file; you can find properties like hive.support.concurrency there.

      Then rename the file to hive-site.xml.

      Delete
    5. I tried to rename it as hive-site.xml but I am not able to connect to Hive.

      Delete
    6. No need to connect to Hive to rename it. Exit Hive, then go to where Hive was extracted and rename the file.

      Delete
    7. I have copied the file hive-default.xml.template as hive-site.xml in the conf directory. After that I tried to connect via the ./bin/hive interface and I am getting errors.
      If possible can you provide me your mobile number so I can explain exactly what I did?

      Delete
    8. my emailid nalinikanth7@gmail.com

      Delete
    9. From where are you starting Hive? Are you using Cloudera?

      Delete

    10. Logging initialized using configuration in jar:file:/opt/apache-hive-1.0.0-bin/lib/hive-common-1.0.0.jar!/hive-log4j.properties
      SLF4J: Class path contains multiple SLF4J bindings.
      SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/opt/apache-hive-1.0.0-bin/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:626)
      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
      Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}
      at org.apache.hadoop.fs.Path.initialize(Path.java:206)
      at org.apache.hadoop.fs.Path.<init>(Path.java:172)
      at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:487)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:430)
      ... 7 more
      Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}
      at java.net.URI.checkPath(URI.java:1804)
      at java.net.URI.<init>(URI.java:752)
      at org.apache.hadoop.fs.Path.initialize(Path.java:203)
      ... 10 more

      Delete
    11. Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file:
      /tmp/hsperfdata_nalin/24733
      Try using the -Djava.io.tmpdir= option to select an alternate temp location.

      Error: Could not find or load main class fs

      Delete
  10. login as: nalin
    nalin@10.64.131.139's password:
    Last login: Fri Feb 20 03:35:34 2015 from 10.2.192.187
    [nalin@bgengmst ~]$ cd $HIVE_HOME
    [nalin@bgengmst apache-hive-1.0.0-bin]$ ./bin/hive
    Unable to determine Hadoop version information.
    'hadoop version' returned:
    Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file: /tmp/hsperfdata_nalin/25370 Try using the -Djava.io.tmpdir= option to select an alternate temp location. Hadoop 2.3.0 Subversion http://svn.apache.org/repos/asf/hadoop/common -r 1567123 Compiled by jenkins on 2014-02-11T13:40Z Compiled with protoc 2.5.0 From source with checksum dfe46336fbc6a044bc124392ec06b85 This command was run using /opt/hadoop/hadoop-2.3.0/share/hadoop/common/hadoop-common-2.3.0.jar
    [nalin@bgengmst apache-hive-1.0.0-bin]$ hdfs fs -ls /
    Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file:
    /tmp/hsperfdata_nalin/25445
    Try using the -Djava.io.tmpdir= option to select an alternate temp location.

    Error: Could not find or load main class fs
    [nalin@bgengmst apache-hive-1.0.0-bin]$

    ReplyDelete
  11. Check /etc/init.d for services. Then you can use the service name to start it manually.

    ReplyDelete
  12. There are no services related to Hive here:
    [nalin@bgengmst etc]$ cd init.d
    [nalin@bgengmst init.d]$ ls
    abrt-ccpp hadoop mdmonitor psacct sandbox
    abrtd haldaemon messagebus quota_nld saslauthd
    abrt-oops halt netconsole rdisc single
    acpid ip6tables netfs restorecond smartd
    atd iptables network rngd sshd
    auditd irqbalance nfs rpcbind svnserve
    blk-availability kdump nfslock rpcgssd sysstat
    cpuspeed killall nimbus rpcidmapd udev-post
    crond lvm2-lvmetad ntpdate rpcsvcgssd xe-linux-distribution
    functions lvm2-monitor postfix rsyslog zookeeper

    ReplyDelete
    Replies
    1. [nalin@bgengmst apache-hive-1.0.0-bin]$ cd /var/log
      [nalin@bgengmst log]$ ls
      anaconda.ifcfg.log ConsoleKit maillog-20150222 spooler
      anaconda.log cron mesos spooler-20150217
      anaconda.program.log cron-20150217 messages spooler-20150222
      anaconda.storage.log cron-20150222 messages-20150217 storage.log
      anaconda.syslog dmesg messages-20150222 tallylog
      anaconda.yum.log dmesg.old prelink wtmp
      audit dracut.log sa yum.log
      boot.log lastlog secure zookeeper
      btmp maillog secure-20150217
      btmp-20150217 maillog-20150217 secure-20150222

      Delete
    2. Can you please try out the same in CDH 5 (Cloudera)?

      Delete
  13. I am using Hive 1.1.0 and facing the same issues as Nalini mentioned above. Is your problem resolved?

    Seems this is the only thread on the internet discussing Hive transaction setup. Any idea how we can reach the Hive team so that they can help?

    Smilevasu, would you please be willing to help by getting into a live meeting? I understand that's too much to ask. In case you decide so, my email ID is amitg2k@gmail.com

    ReplyDelete
  14. Hi Amit, what issue are you facing with Hive? If possible, paste the issue. I sent a request to your mail id.

    ReplyDelete
    Replies
    1. I set up all the configuration as above in hive-default.xml.template and saved it as hive-default.xml in the conf directory. However there was no benefit. I created a new table as mentioned in step 2 above, and I was still getting the same error for update:

      SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

      I configured parameters using CLI:

      hive> set hive.support.concurrency=true;
      hive> set hive.enforce.bucketing=true;
      hive> set hive.exec.dynamic.partition.mode=nonstrict;
      hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
      hive> set hive.compactor.initiator.on=true;
      hive> set hive.compactor.worker.threads=1;

      Now I am unable to connect to metastore itself:

      select * from HiveTest;
      FAILED: LockException [Error 10280]: Error communicating with the metastore

      On repeated attempts to restart hive and setup parameters through CLI, I am unable to execute query.
      When i execute

      select * from HiveTest;

      Cursor moves to next line and keep blinking forever. There are only 4 records in this table and query works fine before setting abovementioned CLI parameters.

      BTW, i am unable to locate your email. Could you please resend?

      Delete
    2. For FAILED: LockException [Error 10280]: Error communicating with the metastore, edit hive-site.xml:
      add a property tag with name hive.in.test and value true,
      and then restart the service.

      Delete
  15. Many thanks #smilevasu for your time and valuable expertise. Could not have imagined and solved so many issues. Five cheers to you :)

    ReplyDelete
  16. How to release a lock? I killed the script using Ctrl+C and just to realize my table is locked:

    hive> show locks;
    OK
    Lock ID Database Table Partition State Type Transaction ID Last Hearbeat Acquired At User Hostname
    253 default stg_cards NULL WAITING EXCLUSIVE NULL 1429344374976 NULL amit ubuntu14
    Time taken: 0.054 seconds, Fetched: 2 row(s)

    ReplyDelete
  17. Hi smilevasu,
    after editing hive-site.xml how to restart hive? can you please mention steps?

    ReplyDelete
  18. Hi I am able to do update and delete in hive but not working in beeline and from jdbc java program.

    Can anyone help on this

    ReplyDelete
  19. Hello,

    Can you explain the table properties you set in the CREATE TABLE statement? For example you have:

    clustered by (department) into 3 buckets
    stored as orc TBLPROPERTIES ('transactional'='true') ;

    Is there a reason you set clustered by or stored as orc like that? is it required for these transactional hive functions?

    ReplyDelete
    Replies
    1. Hi Matthew ,
      I used data from 3 different departments, so I clustered them into 3 buckets, and in order to satisfy ACID properties we need to create the table in ORC format. Can you please try with a normal table?

      Delete
  20. Hi unmesha

    I have done all the things you suggested to enable ACID properties, and I have followed the same steps as Amit Garg, but I am still facing a problem with insert, update and delete. The semantic error is coming.

    Please help me out

    ReplyDelete
  21. Hi Unmesha

    Thanks a lot for the wonderful post. Finally you came for the rescue, this is what I have been searching for months.

    ReplyDelete
    Replies
    1. Hi Unmesha,

      Can you please provide a post on sqoop incremental updates. (not for new rows, for row level updates).

      Delete
  22. hive (default)> INSERT INTO table tomar VALUES(1,'TOM','Pun');
    Query ID = training_20160203110303_156d121d-5e82-4cfb-a2d3-ff97fc8dd0f9
    Total jobs = 1
    Launching Job 1 out of 1
    Number of reduce tasks determined at compile time: 5
    In order to change the average load for a reducer (in bytes):
    set hive.exec.reducers.bytes.per.reducer=
    In order to limit the maximum number of reducers:
    set hive.exec.reducers.max=
    In order to set a constant number of reducers:
    set mapreduce.job.reduces=
    Starting Job = job_1424080128507_5223, Tracking URL = http://hydetamaster:8088/proxy/application_1424080128507_5223/
    Kill Command = /opt/installation/hadoop/bin/hadoop job -kill job_1424080128507_5223
    Interrupting... Be patient, this might take some time.
    Press Ctrl+C again to kill JVM
    killing job with: job_1424080128507_5223


    Hive API stuck at kill command. Can anyone help me out with this ?

    ReplyDelete
  23. hive> update buildingupdatetst
    > set rilfeaturecode = 'L10000'
    > where riluniqueid = 'LO0102_0000023280354';


    FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

    can anyone help me above error

    ReplyDelete
  24. Hi,

    Please let me know which Hive tables you created for this, and let me know the steps you followed till now.

    ReplyDelete
  25. Has anyone tried updating tables where it works in the Hive CLI but does not work from Java after setting the configuration parameters mentioned above? Any idea of a solution for this?

    ReplyDelete
  26. FAILED: Error in acquiring locks: Error communicating with the metastore

    Error: Error while processing statement: FAILED: Error in acquiring locks: Error
    communicating with the metastore (state=42000,code=10)

    Any idea about this ?

    ReplyDelete
  28. How to remove this error in case of external tables?

    ReplyDelete
  29. Is there a way to update the row without having to edit hive-site.xml?

    ReplyDelete
  31. I am using Cloudera Hue and I am unable to locate hive-site.xml.
    Can anyone help?

    ReplyDelete
  56. Hi, I am not able to update in Hive.
    Can you please help me with this query? I set all parameters as per your blog.

    Note: I want this in the command prompt only.

    Thanks,
    Arjun

    ReplyDelete