spark

spark-main.zip (zip archive, 130.7 KB, uploaded 2022-06-15 19:04 by A9_647509)

Description
# Apache Spark

[Apache Spark](https://spark.apache.org/) is a high-performance engine for large-scale computing tasks, such as data processing, machine learning and real-time data streaming. It includes APIs for Java, Python, Scala and R.

## Azure-ready Charts with Containers from marketplace.azurecr.io

This Helm Chart has been configured to pull the Container Images from the Azure Marketplace Public Repository. The following command allows you to download and install all the charts from this repository.

```bash
$ helm repo add bitnami-azure https://marketplace.azurecr.io/helm/v1/repo
```

## TL;DR

```console
$ helm repo add bitnami-azure https://marketplace.azurecr.io/helm/v1/repo
$ helm install my-release bitnami-azure/spark
```

## Introduction

This chart bootstraps a [spark](https://github.com/bitnami/bitnami-docker-spark) deployment on a [Kubernetes](http://kubernetes.io) cluster using the [Helm](https://helm.sh) package manager.

Bitnami charts can be used with [Kubeapps](https://kubeapps.com/) for deployment and management of Helm Charts in clusters. This Helm chart has been tested on top of [Bitnami Kubernetes Production Runtime](https://kubeprod.io/) (BKPR). Deploy BKPR to get automated TLS certificates, logging and monitoring for your applications.

## Prerequisites

- Kubernetes 1.12+
- Helm 3.1.0

## Installing the Chart

To install the chart with the release name `my-release`:

```console
$ helm repo add bitnami-azure https://marketplace.azurecr.io/helm/v1/repo
$ helm install my-release bitnami-azure/spark
```

These commands deploy Spark on the Kubernetes cluster in the default configuration. The [Parameters](#parameters) section lists the parameters that can be configured during installation.

> **Tip**: List all releases using `helm list`

## Uninstalling the Chart

To uninstall/delete the `my-release` statefulset:

```console
$ helm delete my-release
```

The command removes all the Kubernetes components associated with the chart and deletes the release. Note that Helm 3 (required by this chart) has no `--purge` option, and persistent volume claims created by the chart are not removed automatically; delete the corresponding PVCs manually (for example with `kubectl delete pvc`) if you also want to discard the data.

## Parameters

The following tables list the configurable parameters of the spark chart and their default values.

### Global parameters

| Parameter                 | Description                                     | Default                                                 |
|---------------------------|-------------------------------------------------|---------------------------------------------------------|
| `global.imageRegistry`    | Global Docker image registry                    | `nil`                                                   |
| `global.imagePullSecrets` | Global Docker registry secret names as an array | `[]` (does not add image pull secrets to deployed pods) |

### Common parameters

| Parameter           | Description                                                                                               | Default                        |
|---------------------|-----------------------------------------------------------------------------------------------------------|--------------------------------|
| `nameOverride`      | String to partially override common.names.fullname template with a string (will prepend the release name) | `nil`                          |
| `fullnameOverride`  | String to fully override common.names.fullname template with a string                                     | `nil`                          |
| `commonLabels`      | Labels to add to all deployed objects                                                                     | `{}`                           |
| `commonAnnotations` | Annotations to add to all deployed objects                                                                | `{}`                           |
| `kubeVersion`       | Force target Kubernetes version (using Helm capabilities if not set)                                      | `nil`                          |
| `extraDeploy`       | Array of extra objects to deploy with the release                                                         | `[]` (evaluated as a template) |

### Spark parameters

| Parameter           | Description                                      | Default                                                 |
|---------------------|--------------------------------------------------|---------------------------------------------------------|
| `image.registry`    | spark image registry                             | `docker.io`                                             |
| `image.repository`  | spark image name                                 | `bitnami/spark`                                         |
| `image.tag`         | spark image tag                                  | `{TAG_NAME}`                                            |
| `image.pullPolicy`  | spark image pull policy                          | `IfNotPresent`                                          |
| `image.pullSecrets` | Specify docker-registry secret names as an array | `[]` (does not add image pull secrets to deployed pods) |

### Spark master parameters

| Parameter                               | Description                                                           | Default    |
|-----------------------------------------|-----------------------------------------------------------------------|------------|
| `master.debug`                          | Specify if debug values should be set on the master                   | `false`    |
| `master.webPort`                        | Specify the port where the web interface will listen on the master    | `8080`     |
| `master.clusterPort`                    | Specify the port where the master listens to communicate with workers | `7077`     |
| `master.hostAliases`                    | Add deployment host aliases                                           | `[]`       |
| `master.daemonMemoryLimit`              | Set the memory limit for the master daemon                            | No default |
| `master.configOptions`                  | Optional configuration in the form `-Dx=y`                            | No default |
| `master.securityContext.enabled`        | Enable security context                                               | `true`     |
| `master.securityContext.fsGroup`        | Group ID for the container                                            | `1001`     |
| `master.securityContext.runAsUser`      | User ID for the container                                             | `1001`     |
| `master.securityContext.seLinuxOptions` | SELinux options for the container                                     | `{}`       |
| `master.podAnnotations`                 | Annotations for pods in StatefulSet                                   |            |
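The parameters in the tables above can be overridden at install time with `--set`, or gathered in a YAML file passed with `-f`. A minimal sketch, using the chart's documented `master.webPort` and `master.daemonMemoryLimit` parameters (the values shown are only illustrative):

```console
$ # Override individual parameters on the command line
$ helm install my-release \
  --set master.webPort=8080 \
  --set master.daemonMemoryLimit=2g \
  bitnami-azure/spark

$ # Equivalently, collect overrides in a values file
$ cat > values-custom.yaml <<EOF
master:
  webPort: 8080
  daemonMemoryLimit: 2g
EOF
$ helm install my-release -f values-custom.yaml bitnami-azure/spark
```

A values file is usually preferable for more than a couple of overrides, since it can be versioned alongside the rest of your deployment configuration.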