Search results 1 to 6 of 6 (0.0s).
[SPARK-29771] Limit executor max failures before failing the application - Spark - [issue]
...ExecutorPodsAllocator does not limit the number of executor errors or deletions, which may cause executors to restart continuously without the application failing. A simple example of this: add --co...
http://issues.apache.org/jira/browse/SPARK-29771    Author: Jackey Lee, 2019-11-16, 07:47
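The guard the issue asks for can be sketched as a failure counter checked on each executor loss. This is a hypothetical illustration in plain Python, not Spark's ExecutorPodsAllocator API; the class and config names are invented:

```python
# Hypothetical sketch of the limit SPARK-29771 proposes: count executor
# failures and abort the application once a configurable maximum is exceeded,
# instead of restarting executors indefinitely.
MAX_EXECUTOR_FAILURES = 4  # assumed configurable limit, not a real Spark conf


class ExecutorAllocator:
    def __init__(self, max_failures=MAX_EXECUTOR_FAILURES):
        self.max_failures = max_failures
        self.failure_count = 0

    def on_executor_failed(self):
        # Each failure increments the counter; past the limit, fail the app.
        self.failure_count += 1
        if self.failure_count > self.max_failures:
            raise RuntimeError(
                f"Application failed: exceeded {self.max_failures} executor failures"
            )
        return "restarting executor"
```

Without such a cap, a misconfigured executor (the `--co...` example in the snippet) restarts forever and the application never surfaces the error.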
[SPARK-29444] Add configuration to support JacksonGenerator to keep fields with null values - Spark - [issue]
...Dataset.toJSON will lose some columns when field data is null. It may be better to keep null data in some scenarios, such as sparkmagic, which is widely used in Jupyter with Livy, where we use toJ...
http://issues.apache.org/jira/browse/SPARK-29444    Author: Jackey Lee, 2019-10-29, 19:51
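The behavior in question can be illustrated without Spark. This plain-Python sketch contrasts the two JSON outputs the issue describes: dropping null fields (current behavior) versus keeping them (the proposed configuration):

```python
import json

# A row with a null column, as a plain dict for illustration.
row = {"name": "alice", "age": None}

# Drop-null behavior: the null field disappears from the JSON entirely,
# so downstream consumers lose the column.
dropped = json.dumps({k: v for k, v in row.items() if v is not None})

# Keep-null behavior: the field survives with an explicit JSON null,
# preserving the row's schema.
kept = json.dumps(row)
```

Consumers like sparkmagic that reconstruct tabular data from JSON rely on every row carrying the same fields, which is why the dropped-field output is a problem.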
[SPARK-24630] SPIP: Support SQLStreaming in Spark - Spark - [issue]
...At present, KafkaSQL, Flink SQL (which is actually based on Calcite), SQLStream, and StormSQL all provide a streaming SQL interface, with which users with little knowledge about streaming can...
http://issues.apache.org/jira/browse/SPARK-24630    Author: Jackey Lee, 2019-09-16, 18:20
[SPARK-26912] Allow setting permission for event_log - Spark - [issue]
...The Spark event log is set to 770 permissions by default. This is a problem in the following scenarios: 1) the user has some UGIs that cannot be added to the same group; 2) the user who crea...
http://issues.apache.org/jira/browse/SPARK-26912    Author: Jackey Lee, 2019-07-16, 16:41
[SPARK-25937] Support user-defined schema in Kafka Source & Sink - Spark - [issue]
...Kafka Source & Sink are widely used in Spark and are among the most frequently used in streaming production environments. But at present, both Kafka Source and Sink use a fixed schema, which ...
http://issues.apache.org/jira/browse/SPARK-25937    Author: Jackey Lee, 2019-07-16, 16:41
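For context, the fixed schema the snippet refers to is the column set Spark's Kafka source produces; the record payload always arrives as a binary `value` column that users must decode downstream. Listed here as plain data for illustration:

```python
# The fixed (field name, Spark SQL type) columns of Spark's Kafka source.
# Because the payload is always binary, any user-defined schema today has
# to be applied by parsing `value` after the fact (e.g. with from_json).
KAFKA_SOURCE_SCHEMA = [
    ("key", "binary"),
    ("value", "binary"),
    ("topic", "string"),
    ("partition", "int"),
    ("offset", "long"),
    ("timestamp", "timestamp"),
    ("timestampType", "int"),
]
```

The issue proposes letting users supply their own schema for `key`/`value` directly instead of decoding the binary columns manually.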
[SPARK-26394] Annotation error for Utils.timeStringAsMs - Spark - [issue]
...Utils.timeStringAsMs() parses a time string to milliseconds, but its annotation says "Convert a time parameter such as (50s, 100ms, or 250us) to microseconds for internal use." Thus, microsecon...
http://issues.apache.org/jira/browse/SPARK-26394    Author: Jackey Lee, 2018-12-18, 18:18
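The mismatch is easy to demonstrate with a hypothetical re-implementation (plain Python, not Spark's actual utility): because the function converts to milliseconds, a microsecond input like "250us" truncates to 0, which the "to microseconds" doc comment would never suggest:

```python
import re

# Milliseconds per unit; "us" is fractional because the result is in ms.
SUFFIX_TO_MS = {
    "us": 1 / 1000,
    "ms": 1,
    "s": 1000,
    "m": 60_000,
    "min": 60_000,
    "h": 3_600_000,
    "d": 86_400_000,
}


def time_string_as_ms(s):
    """Parse a time string like '50s' or '100ms' into whole milliseconds."""
    m = re.fullmatch(r"(-?\d+)\s*([a-z]+)?", s.strip().lower())
    if not m:
        raise ValueError(f"invalid time string: {s!r}")
    value = int(m.group(1))
    suffix = m.group(2) or "ms"  # bare numbers default to ms here
    if suffix not in SUFFIX_TO_MS:
        raise ValueError(f"unknown time suffix: {suffix!r}")
    # Integer truncation: sub-millisecond values collapse to 0 ms.
    return int(value * SUFFIX_TO_MS[suffix])
```

So `time_string_as_ms("250us")` yields 0, confirming the return unit is milliseconds and the doc comment's "microseconds" is the error.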