This post is a continuation of the previous post [AWS -Python] DynamoDB with Boto3, which covers the basic DynamoDB operations.
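As a taste of what the post covers, here is a minimal sketch of batch writing and scan paging with Boto3. The `Movies` table name and its key attributes are placeholders, not taken from the post.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Movies")  # hypothetical table name

# Batch write: batch_writer buffers put_item calls and flushes
# them to DynamoDB in chunks of up to 25 items.
with table.batch_writer() as batch:
    for i in range(100):
        batch.put_item(Item={"year": 2020, "title": f"Movie {i}"})

# Paging: a scan returns at most 1 MB per call; keep following
# LastEvaluatedKey until it no longer appears in the response.
items = []
response = table.scan()
items.extend(response["Items"])
while "LastEvaluatedKey" in response:
    response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
    items.extend(response["Items"])
```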
Continue reading “[AWS -Python] DynamoDB Batch and Paging with Boto3”
[AWS -Python] DynamoDB with Boto3
Let’s work with DynamoDB using the Python Boto3 library.
With it, we can:
- list tables
- retrieve table information including keys and indexes
- create a table
- read, write, and delete items
- delete a table
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html
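As a minimal sketch of these operations (the `Movies` table and its key schema are hypothetical):

```python
import boto3

dynamodb = boto3.resource("dynamodb")

# List tables (via the low-level client).
client = boto3.client("dynamodb")
print(client.list_tables()["TableNames"])

# Create a table with a partition key and a sort key.
table = dynamodb.create_table(
    TableName="Movies",  # hypothetical table name
    KeySchema=[
        {"AttributeName": "year", "KeyType": "HASH"},
        {"AttributeName": "title", "KeyType": "RANGE"},
    ],
    AttributeDefinitions=[
        {"AttributeName": "year", "AttributeType": "N"},
        {"AttributeName": "title", "AttributeType": "S"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Write, read, and delete an item.
table.put_item(Item={"year": 2020, "title": "My Movie"})
item = table.get_item(Key={"year": 2020, "title": "My Movie"}).get("Item")
table.delete_item(Key={"year": 2020, "title": "My Movie"})

# Delete the table.
table.delete()
```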
Continue reading “[AWS -Python] DynamoDB with Boto3”
[AWS] Lambda with Python
This post shows various small examples of Lambda functions to demonstrate what you can do with Lambda. The language of choice is Python, but it is easy to port the examples to other programming languages.
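For orientation, every Python Lambda function shares the handler shape below. The event payload depends on the trigger, so the `name` field here is just an assumption.

```python
import json

def lambda_handler(event, context):
    # Lambda invokes this handler with the trigger's event payload
    # and a context object holding runtime metadata.
    name = event.get("name", "world")  # "name" is a hypothetical event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```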
Continue reading “[AWS] Lambda with Python”
[Spark By Example] Read SQL Server
The following sample code (in C#) shows how to read data from Microsoft SQL Server.
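The post’s sample is in C#; as a rough PySpark equivalent, a JDBC read looks like the sketch below. The server, database, table, and credentials are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ReadSqlServer").getOrCreate()

# Requires the Microsoft JDBC driver for SQL Server on the classpath.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://myserver:1433;databaseName=mydb")  # placeholder
    .option("dbtable", "dbo.MyTable")   # placeholder
    .option("user", "myuser")           # placeholder
    .option("password", "mypassword")   # placeholder
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.show()
```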
Continue reading “[Spark By Example] Read SQL Server”
[Spark By Example] Schema
Spark can infer the data structure, but you can specify it explicitly by providing a schema to the DataFrame.
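For instance, a schema for a hypothetical CSV of people might be declared like this:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.appName("SchemaExample").getOrCreate()

# Explicit schema instead of letting Spark infer one.
schema = StructType([
    StructField("name", StringType(), nullable=True),
    StructField("age", IntegerType(), nullable=True),
])

df = spark.read.schema(schema).csv("people.csv")  # hypothetical file
df.printSchema()
```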
Continue reading “[Spark By Example] Schema”
[Spark By Example] SparkSession
The following sample code (in Python and C#) shows how to use SparkSession.
SparkSession
- SparkSession is the entry point to your Spark application since Spark version 2.
- SparkSession wraps all the different contexts (SparkContext, SQLContext, HiveContext, …) into a single entry point.
- You can create as many SparkSessions as you want.
- In a Spark shell, such as the PySpark shell, a SparkSession object (named “spark”) is created for you.
- In an application, you need to create the SparkSession object yourself, as sketched below.
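In its simplest form, creating a SparkSession is one chained call on the builder; the application name here is hypothetical.

```python
from pyspark.sql import SparkSession

# getOrCreate() returns the active session if one already exists.
spark = (
    SparkSession.builder
    .appName("MyApp")     # hypothetical application name
    .master("local[*]")   # run locally; omit when submitting to a cluster
    .getOrCreate()
)
print(spark.version)
```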
[Spark By Example] Explode and Collect
In Spark, you often deal with data that arrives as arrays or maps. The following example shows how to convert collection data to rows and vice versa.
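A minimal sketch of both directions, using a hypothetical two-column DataFrame:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, collect_list

spark = SparkSession.builder.appName("ExplodeCollect").getOrCreate()

df = spark.createDataFrame(
    [("alice", ["red", "blue"]), ("bob", ["green"])],
    ["name", "colors"],  # hypothetical columns
)

# explode: one output row per array element.
exploded = df.select("name", explode("colors").alias("color"))
exploded.show()

# collect_list: fold the rows back into an array per name.
collected = exploded.groupBy("name").agg(collect_list("color").alias("colors"))
collected.show()
```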
Continue reading “[Spark By Example] Explode and Collect”
[Spark By Example] Spark SQL – UDFs
In Spark SQL, you can define your own functions and use them in SQL statements. The following example shows how to create a very simple UDF, register it, and use it in SQL.
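A minimal sketch of that flow; the function name `to_upper`, the `people` view, and the data are made up for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("SqlUdf").getOrCreate()

df = spark.createDataFrame([("alice",), ("bob",)], ["name"])  # hypothetical data
df.createOrReplaceTempView("people")

# Register a Python function as a SQL UDF, then call it from SQL.
spark.udf.register("to_upper", lambda s: s.upper() if s else None, StringType())
spark.sql("SELECT to_upper(name) AS upper_name FROM people").show()
```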
Continue reading “[Spark By Example] Spark SQL – UDFs”