initial poc commit of DynamoDB Mapper #1232
Conversation
A new generated diff is ready to view.
I'll probably have more comments but you said to ignore most of the code so I'm not sure where to draw the line 🤷♂️
I would suggest some changes to the package layout/names:
hll/
  ddb/
    ddb-mapper-runtime/
    ddb-mapper-annotations/
    ddb-annotation-processor/
    tests/
      ddb-annotation-process-tests/
The modules like annotations and processor do not retain their project hierarchy when published. All the context around belonging to hll/dynamodb-mapper is gone. They would simply be at GAV coordinates like aws.sdk.kotlin:annotations:<version>. You can of course control the name, but it's easier/safer to just bake it into the defaults Gradle/MavenPublisher use by naming each module consistently with its artifact name.
I made comments elsewhere on package namespace.
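For illustration, here is a minimal sketch of the explicit-override alternative (the artifact name is hypothetical and this assumes the standard maven-publish plugin defaults):

// Hypothetical sketch: override the artifactId so the published GAV coordinates
// carry the full context even if the Gradle project keeps a short name.
publishing {
    publications.withType<MavenPublication> {
        artifactId = "dynamodb-mapper-annotations"
    }
}

Naming the projects to match the intended artifact names avoids needing this override at all.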
hlls/build.gradle.kts (Outdated)

description = "High-level libraries for the AWS SDK for Kotlin"

// FIXME 🔽🔽🔽 This is all copied from :aws-runtime and should be commonized 🔽🔽🔽
maybe
 * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
 * SPDX-License-Identifier: Apache-2.0
 */
package aws.sdk.kotlin.hlls.dynamodbmapper
suggestion: aws.sdk.kotlin.hll.dynamodbmapper or aws.sdk.kotlin.hll.ddbmapper
public annotation class DdbAttribute(val name: String)

@Target(AnnotationTarget.CLASS)
public annotation class DdbItem
Do we have enough to write documentation on these? If not let's add a TODO in here to add docs.
package aws.sdk.kotlin.hlls.dynamodbmapper

@Target(AnnotationTarget.PROPERTY)
public annotation class DdbAttribute(val name: String)
bikeshed: I'm not usually a huge fan of repeating information. In this case the annotation is a member of the aws.sdk.kotlin.hll.dynamodbmapper namespace, so DdbAttribute is redundant. On the other hand, generic names like Item and Attribute are likely to have conflicts and force import alias usage (making code harder to read consistently). A hypothetical example of that aliasing follows below.
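For illustration only (the package and annotation names here are made up for the example, not the PR's actual layout):

// Hypothetical: if the annotation and the runtime wrapper were both named Item,
// consuming code would need an import alias to use them together.
import aws.sdk.kotlin.hll.dynamodbmapper.annotations.Item as ItemAnnotation
import aws.sdk.kotlin.hll.dynamodbmapper.Item

@ItemAnnotation
data class User(val id: Int)

fun store(item: Item) { /* operates on the runtime Item type */ }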
Yeah, I struggled with the names of these and decided to prefix Ddb because of where they'll primarily be used: in user code decorating user data classes. In generic DAO layers, I expect it'll be far more common to deal with concerns for various use cases and vendors, particularly around ways to transform the shape into something usable externally:
package com.foo.bar

import aws.sdk.kotlin.hlls.dynamodbmapper.*
import kotlinx.serialization.*

@DdbItem
@Serializable
data class Project(
    @DdbAttribute("myDatabaseId")
    @SerialName("myJsonId")
    val id: Int,

    @DdbAttribute("myDatabaseName")
    @SerialName("myJsonName")
    val name: String,
)
I'm not fixed on this pattern, though, and I'm open to bikeshedding. Item is already a type in the mapper runtime. Attribute is not yet, but I expect it could be as we write more code and need a wrapper type for Pair<String, AttributeValue>. Do you think identically named attributes and runtime types could cause confusion?
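For context, a minimal sketch of what such a wrapper could look like (purely speculative; the name and shape are not part of this PR):

// Speculative sketch of a wrapper over a name/value pair, using the
// low-level DynamoDB AttributeValue model type.
import aws.sdk.kotlin.services.dynamodb.model.AttributeValue

public data class Attribute(public val name: String, public val value: AttributeValue)

public fun Attribute.toPair(): Pair<String, AttributeValue> = name to value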
Do you think identically named attributes and runtime types could cause confusion?
Possibly. FWIW I'm just bringing this up for discussion; I'm not even sure where I land on it just yet. I think your choices are sensible and perhaps even desirable from a readability standpoint for code that consumes these annotations.
In line with the comment below, I'll change the prefixes of these attributes from Ddb to DynamoDb.
import aws.sdk.kotlin.hlls.dynamodbmapper.schemas.ItemSchema
import aws.sdk.kotlin.services.dynamodb.DynamoDbClient

public class DdbMapper(public val client: DynamoDbClient) {
Documentation?
I know this was from POC code but I tend to find being forced to write documentation to be clarifying. It also reduces the amount of docs we have to produce later.
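For example, even a minimal KDoc sketch along these lines would help (the wording here is illustrative only):

/**
 * A high-level client that maps annotated data classes to DynamoDB items and
 * delegates the underlying calls to the low-level [DynamoDbClient].
 *
 * @param client the low-level DynamoDB client used to execute requests
 */
public class DdbMapper(public val client: DynamoDbClient) { /* ... */ }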
I'm going to refresh the DdbMapper class as part of the next runtime task of supporting mapper initialization. I'll defer docs until that point.
    public fun toMutableItem(): MutableItem = MutableItem(delegate.toMutableMap())
}

public fun buildItem(block: MutableItem.() -> Unit): Item = MutableItem(mutableMapOf()).apply(block).toItem()
Should be inline
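That is, something like this (a sketch of the suggested change, assuming no other signature changes):

// Marking the builder inline avoids allocating a lambda object for the block.
public inline fun buildItem(block: MutableItem.() -> Unit): Item =
    MutableItem(mutableMapOf()).apply(block).toItem()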
}

public interface SerializeInput<I, HReq> {
    public val highLevelRequest: HReq
A simple request would suffice, right?
This highLevelRequest member gets pulled into basically every interface that follows this one. The distinction highLevel becomes important when there's also a lowLevelRequest in the type.
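For illustration, a hypothetical later interface in the pipeline (the interface name and type parameters are illustrative, not the actual design):

// Hypothetical: once a low-level request also exists in the type, the
// unqualified name "request" would be ambiguous.
public interface SerializeOutput<HReq, LReq> {
    public val highLevelRequest: HReq
    public val lowLevelRequest: LReq
}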
Ok I'm going to defer this bikeshed until we get to it 😉
- change package and project names
- add initial documentation for annotations
- add TODOs around docs, interface abstraction, unit tests, etc.
A new generated diff is ready to view.
The overall project structure and Gradle setup look good to me!
val keyProp = checkNotNull(props.singleOrNull { it.isPk }) {
    "Expected exactly one @DdbPartitionKey annotation on a property"
}
simplification: props.single { ... } instead of checkNotNull(props.singleOrNull { ... }), unless you specifically want the exception thrown by checkNotNull
I think throwing a specific exception type will be important, but this whole processor implementation is just a strawman; it's going to be wholesale replaced once we figure out a codegen model we like.
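For reference, a sketch of the two options side by side (the specific exception type below is hypothetical, not part of the PR):

// Option 1: rely on single(), which throws NoSuchElementException or
// IllegalArgumentException with a generic message.
val keyProp = props.single { it.isPk }

// Option 2: keep the explicit check and throw a purpose-built exception type.
val keyPropExplicit = props.singleOrNull { it.isPk }
    ?: throw AnnotationProcessingException("Expected exactly one @DdbPartitionKey annotation on a property")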
import aws.sdk.kotlin.services.dynamodb.DynamoDbClient

// TODO refactor to interface, add support for multi-table operations, document, add unit tests
public class DdbMapper(public val client: DynamoDbClient) {
naming: not sure if we discussed this name already, and I don't have a strong opinion, but DynamoDbMapper would align more closely with the low-level client name and the package dynamodbmapper.
You're right, DynamoDbMapper is a better name. Will change.
    }
}

public class SingleKey<I, PK> internal constructor(
naming: now that we've changed to CompositeKey (below), I feel like SingleKey should maybe be changed to PartitionKey to align with the DynamoDB docs?
Overall structure looks like a good starting point.
- renaming DdbMapper to DynamoDbMapper
- changing Ddb prefix for attributes to DynamoDb
Quality Gate passed
A new generated diff is ready to view.
…lin (#1451)

* initial poc commit of DynamoDB Mapper (#1232)
* add support for Mapper initialization (#1237)
* implement mapper pipeline (#1266)
* initial implementation of codegen for low-level operations/types (#1357)
* initial implementation of secondary index support (#1375)
* Create new codegen module and refactor annotation processor to use it (#1382)
* feat: add Schema generator Gradle plugin (#1385)
* Fix plugin test package
* add attribute converters for "standard" values (#1381)
* fix: schema generator plugin test module (#1394)
* feat: annotation processor codegen configuration (#1392)
* feat: add `@DynamoDbIgnore` annotation (#1402)
* DDB Mapper filter expressions (runtime components) (#1401)
* feat: basic annotation processing (#1399)
* add DSL overloads, paginators, and better builder integration for DDB Mapper ops codegen (#1409)
* chore: split dynamodb-mapper-codegen into two modules (#1414)
* emit DDB_MAPPER business metric (#1426)
* feat: setup DynamoDbMapper publication (#1419)
* DDB Mapper filter expressions (codegen components) (#1424)
* correct docs
* mark every HLL/DDBM API experimental (#1428)
* fix accidental inclusion of expression attribute members in high-level DynamoDB Mapper requests (#1432)
* Upgrade to latest build plugin version
* fix: various issues found during testing (#1450)
* chore: update Athena changelog notes for 1.3.57 (2024-10-18) release (#1449)
* feat: update AWS API models
* feat: update AWS service endpoints metadata
* chore: release 1.3.60
* chore: bump snapshot version to 1.3.61-SNAPSHOT
* feat: initial release of Developer Preview of DynamoDB Mapper for Kotlin
* Fix Kotlin gradle-plugin version
* fix: ddb mapper tests (#1453)
* Bump build plugin version

---------

Co-authored-by: Matas <lauzmata@amazon.com>
Co-authored-by: aws-sdk-kotlin-ci <aws-kotlin-sdk-automation@amazon.com>
Issue #
#76
Description of changes
This cleans up and commits the working POC for a DynamoDB Mapper. This code bears some similarity to the imagined final state but will be updated by later tasks to more definitively flesh out things like a request pipeline, mapper initialization, secondary indices, testing, KDocs, a Gradle plugin for the processor, and a rewrite of the codegen layer.
For now I'd like to solicit feedback on:
Project outline
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.