Catalog Details
CATEGORY: deployment
CREATED BY:
UPDATED AT: April 08, 2024
VERSION: 1.0
What this pattern does:
A batch workload is a process typically designed to have a start and a completion point. You should consider batch workloads on GKE if your architecture involves ingesting, processing, and outputting data instead of using raw data. Areas like machine learning, artificial intelligence, and high performance computing (HPC) feature many kinds of batch workloads, such as offline model training, batched prediction, data analytics, simulation of physical systems, and video processing. By designing containerized batch workloads, you can leverage the following GKE benefits:
- An open standard, a broad community, and a managed service.
- Cost efficiency from effective workload and infrastructure orchestration and specialized compute resources.
- Isolation and portability of containerization, allowing the use of cloud as overflow capacity while maintaining data security.
- Availability of burst capacity, followed by rapid scale down of GKE clusters.
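As an illustration of what such a workload can look like, the sketch below uses the Kubernetes Python client to submit a parallel batch Job to a GKE cluster. The image, labels, resource requests, and parallelism values are placeholders and not part of this pattern's definition.

```python
from kubernetes import client, config

# Assumes kubectl is already configured to point at the target GKE cluster.
config.load_kube_config()

# A hypothetical fan-out batch Job: three workers each process an independent
# chunk, then the Job completes and the burst capacity can be released.
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="batch-demo", labels={"app": "batch-worker"}),
    spec=client.V1JobSpec(
        parallelism=3,
        completions=3,
        backoff_limit=2,
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "batch-worker"}),
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="worker",
                        image="busybox:1.36",  # placeholder image
                        command=["sh", "-c", "echo processing chunk && sleep 10"],
                        resources=client.V1ResourceRequirements(
                            requests={"cpu": "250m", "memory": "128Mi"},
                        ),
                    )
                ],
            ),
        ),
    ),
)

client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```

Because a Job has a defined completion point, the cluster autoscaler can add nodes for the burst and remove them once the pods finish, which is what enables the cost-efficiency and rapid scale-down benefits listed above.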
Caveats and Considerations:
Ensure the components are networked correctly so that the batch pods can reach each other and their data sources efficiently.
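One way to make that networking explicit is a NetworkPolicy that limits the batch workers' egress to the data service they depend on. This is only a sketch under assumptions the pattern does not state: the label selectors, namespace, and port below are hypothetical.

```python
from kubernetes import client, config

config.load_kube_config()

# Hypothetical policy: allow pods labeled app=batch-worker to reach only the
# in-cluster data service on TCP/5432, and nothing else, for egress traffic.
policy = client.V1NetworkPolicy(
    api_version="networking.k8s.io/v1",
    kind="NetworkPolicy",
    metadata=client.V1ObjectMeta(name="batch-workers-egress", namespace="default"),
    spec=client.V1NetworkPolicySpec(
        pod_selector=client.V1LabelSelector(match_labels={"app": "batch-worker"}),
        policy_types=["Egress"],
        egress=[
            client.V1NetworkPolicyEgressRule(
                to=[client.V1NetworkPolicyPeer(
                    pod_selector=client.V1LabelSelector(match_labels={"app": "data-service"})
                )],
                ports=[client.V1NetworkPolicyPort(protocol="TCP", port=5432)],
            )
        ],
    ),
)

client.NetworkingV1Api().create_namespaced_network_policy(namespace="default", body=policy)
```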
Compatibility: