Search results 1 to 10 of 1029.
kafka connection from docker - Kafka - [mail # user]
...I do not understand this. You have a physical host running zookeeper locally and a broker running as well. These are using default physical host ports like 2181, 9092 etc. Then you have insta...
   Author: Mich Talebzadeh, 2019-10-17, 21:36
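
The thread is about reaching a broker that runs on the physical host (ports 2181, 9092) from inside a Docker container. A minimal consumer sketch under assumptions: the host is reachable from the container as host.docker.internal, and the topic and group names below are made up for illustration; the broker's advertised listeners also need to resolve from inside the container.

    import java.time.Duration
    import java.util.Properties
    import scala.collection.JavaConverters._
    import org.apache.kafka.clients.consumer.KafkaConsumer

    object HostBrokerConsumer {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        // Inside a container, "localhost:9092" is the container itself, not the
        // physical host. "host.docker.internal" is an assumption for illustration.
        props.put("bootstrap.servers", "host.docker.internal:9092")
        props.put("group.id", "docker-test")
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(java.util.Collections.singletonList("test-topic"))  // hypothetical topic
        consumer.poll(Duration.ofSeconds(5)).asScala.foreach(r => println(s"${r.key} -> ${r.value}"))
        consumer.close()
      }
    }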
org.apache.spark.util.SparkUncaughtExceptionHandler - Spark - [mail # user]
...Hi Nimmi, Can you send us the spark parameters with overhead, assuming you are running with yarn. Example [4] - 864GB --num-executors 32 --executor-memory 21G --executor-cores 4 --conf spark.yarn.exe...
   Author: Mich Talebzadeh, 2019-10-10, 21:10
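
The reply asks for the executor settings including memory overhead on YARN. A sketch of setting the figures quoted in the snippet on a SparkSession; the app name and the overhead value are assumptions (the snippet's overhead setting is truncated), the rest comes from the quoted flags.

    import org.apache.spark.sql.SparkSession

    // Executor sizing taken from the figures quoted in the thread.
    val spark = SparkSession.builder()
      .appName("overhead-example")                    // hypothetical app name
      .config("spark.executor.instances", "32")       // --num-executors 32
      .config("spark.executor.memory", "21g")         // --executor-memory 21G
      .config("spark.executor.cores", "4")            // --executor-cores 4
      .config("spark.executor.memoryOverhead", "3g")  // assumed value; default is 10% of executor memory
      .getOrCreate()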
Google Cloud and Spark in the docker consideration for real time streaming data - Spark - [mail # user]
...Hi, We have designed an efficient trading system that runs on a cluster of three nodes in Google Cloud Platform (GCP). Each node runs one Zookeeper and one Kafka broker. So in total we have three ...
   Author: Mich Talebzadeh, 2019-09-23, 19:05
Control Sqoop job from Spark job - Spark - [mail # user]
...Hi, Just to clarify, JDBC connection to RDBMS from Spark is slow? This one read from an Oracle table with 4 connections in parallel to the Oracle table, assuming there is a primary key on the Oracle...
   Author: Mich Talebzadeh, 2019-09-02, 18:06
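
The snippet describes reading an Oracle table with 4 parallel JDBC connections, assuming a primary key to partition on. A sketch of Spark's JDBC source with partitioned reads; the URL, credentials, table, partition column and bounds are all assumptions for illustration.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("oracle-jdbc-read").getOrCreate()

    // All connection details below are assumptions; partitionColumn should be
    // the numeric primary key the thread refers to.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCL")
      .option("dbtable", "scott.sales")
      .option("user", "scratchpad")
      .option("password", "xxxx")
      .option("partitionColumn", "object_id")
      .option("lowerBound", "1")
      .option("upperBound", "1000000")
      .option("numPartitions", "4")   // 4 parallel connections, as in the thread
      .load()

    df.printSchema()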
Questions for platform to choose - Spark - [mail # user]
...Hi, What is the definition of real time here? The engineering definition of real time is roughly fast enough to be interactive. However, I put a stronger definition. In real time application or ...
   Author: Mich Talebzadeh, 2019-08-21, 09:12
Unable to write data from Spark into a Hive Managed table - Spark - [mail # user]
...Check your permissioning. Can you do insert select from external table into Hive managed table created by spark? //// Need to create and populate target ORC table transactioncodes_ll in database...
   Author: Mich Talebzadeh, 2019-08-09, 16:51
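
The reply suggests testing an insert ... select from the external table into a Hive managed ORC table. A sketch of doing that from Spark SQL; the target table name comes from the snippet, but the database name, columns and the external table name are assumptions.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("external-to-managed")
      .enableHiveSupport()      // needed so Spark uses the Hive metastore
      .getOrCreate()

    // Database, columns and the source external table are assumptions.
    spark.sql(
      """CREATE TABLE IF NOT EXISTS accounts.transactioncodes_ll (id INT, code STRING)
        |STORED AS ORC""".stripMargin)

    spark.sql(
      """INSERT INTO accounts.transactioncodes_ll
        |SELECT id, code FROM accounts.transactioncodes_external""".stripMargin)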
Spark SQL reads all leaf directories on a partitioned Hive table - Spark - [mail # user]
...also need others as well using soft link. ls -l; cd $SPARK_HOME/conf; hive-site.xml -> ${HIVE_HOME}/conf/hive-site.xml; core-site.xml -> ${HADOOP_HOME}/etc/hadoop/core-site.xml; hdfs-site.xml -> ...
   Author: Mich Talebzadeh, 2019-08-08, 19:05
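
The reply lists soft links that should sit in $SPARK_HOME/conf and point at the Hive and Hadoop config files. A sketch of creating the same links from Scala via java.nio instead of shell ln -s; it assumes SPARK_HOME, HIVE_HOME and HADOOP_HOME are set in the environment.

    import java.nio.file.{Files, Paths}

    // Same links as listed in the thread, created programmatically.
    val sparkConf = Paths.get(sys.env("SPARK_HOME"), "conf")
    val links = Seq(
      "hive-site.xml" -> Paths.get(sys.env("HIVE_HOME"), "conf", "hive-site.xml"),
      "core-site.xml" -> Paths.get(sys.env("HADOOP_HOME"), "etc", "hadoop", "core-site.xml"),
      "hdfs-site.xml" -> Paths.get(sys.env("HADOOP_HOME"), "etc", "hadoop", "hdfs-site.xml")
    )
    links.foreach { case (name, target) =>
      val link = sparkConf.resolve(name)
      if (!Files.exists(link)) Files.createSymbolicLink(link, target)
    }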
Sharing ideas on using Databricks Delta Lake - Spark - [mail # user]
...I upgraded my Spark to 2.4.3 that allows using the storage layer Delta Lake. Actually, I wish Databricks would have chosen a different name for it :) Anyhow, although most examples of stor...
   Author: Mich Talebzadeh, 2019-08-07, 21:06
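
The snippet mentions moving to Spark 2.4.3 in order to use the Delta Lake storage layer. A minimal write/read sketch, assuming the Delta Lake dependency (e.g. io.delta:delta-core_2.11 for Spark 2.4) is on the classpath; the path and sample data are made up for illustration.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("delta-example").getOrCreate()
    import spark.implicits._

    // Hypothetical sample data and output path.
    val df = Seq((1, "buy"), (2, "sell")).toDF("id", "side")

    df.write.format("delta").mode("overwrite").save("/tmp/delta/trades")

    val back = spark.read.format("delta").load("/tmp/delta/trades")
    back.show()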
Hive external table not working in sparkSQL when subdirectories are present - Hive - [mail # user]
...Have you updated partition statistics by any chance? I assume you can access the table and data through Hive itself? HTH, Dr Mich Talebzadeh, LinkedIn * https://www.linkedin.com/profile/view?id=AAE...
   Author: Mich Talebzadeh, 2019-08-07, 20:51
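
The reply asks whether partition statistics have been refreshed. A sketch of picking up externally added partitions and recomputing statistics from Spark SQL; the database and table name are hypothetical.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("refresh-partition-stats")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical external table; register partitions added outside Spark,
    // then recompute table-level statistics.
    spark.sql("MSCK REPAIR TABLE mydb.external_events")
    spark.sql("ANALYZE TABLE mydb.external_events COMPUTE STATISTICS")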
How to read configuration file parameters in Spark without mapping each parameter - Spark - [mail # user]
...Hi, Assume that I have a configuration file as below with static parameters, some Strings, Integer and Double: md_Aerospike Aerospike { dbHost = "rhes75" dbPort = "3000" dbConnection = "trading_user...
   Author: Mich Talebzadeh, 2019-08-06, 21:34
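
The question shows a HOCON-style block (Aerospike { dbHost = ... }). A sketch of reading it with Typesafe Config instead of mapping each parameter by hand; it assumes com.typesafe:config is on the classpath and that the file is available as a resource (the file name is an assumption).

    import com.typesafe.config.ConfigFactory

    // File name is hypothetical; the block and key names follow the snippet.
    val conf = ConfigFactory.parseResources("md_Aerospike.conf").resolve()
    val aerospike = conf.getConfig("Aerospike")

    val dbHost = aerospike.getString("dbHost")              // "rhes75" in the snippet
    val dbPort = aerospike.getString("dbPort")              // stored as a string, "3000"
    val dbConnection = aerospike.getString("dbConnection")

    println(s"$dbHost:$dbPort as $dbConnection")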