Releases: takapi327/ldbc
v0.5.0
ldbc v0.5.0 is released. 🎉
This release brings major enhancements to the ecosystem with ZIO support, advanced authentication capabilities, and significant security and performance improvements.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
🎉 Release Highlights
- ZIO Ecosystem Integration: Complete ZIO support through the new ldbc-zio-interop module
- Enhanced Authentication: Pure Scala3 authentication plugins including AWS Aurora IAM support
- Security Improvements: Enhanced SQL parameter escaping and SSRF attack protection
- API Enhancements: File-based query execution with the new updateRaws method
- Performance Optimizations: Maximum packet size configuration and improved connection pool concurrency
What's Changed
🆕 New Modules
ldbc-zio-interop: ZIO ecosystem integration for seamless ZIO application development
ldbc-authentication-plugin: Pure Scala3 MySQL authentication plugins (Clear Password)
ldbc-aws-authentication-plugin: AWS Aurora IAM authentication support
⚠️ Deprecated Modules
ldbc-hikari: Officially deprecated in favor of built-in connection pooling
🔧 Breaking Changes
- ldbc-hikari deprecation: Package is deprecated and will be removed in future versions
- Migration required: Applications using ldbc-hikari should migrate to built-in connection pooling
- Binary compatibility: Not binary compatible with prior versions, but source compatibility maintained for most user code
- API changes: Some minor API adjustments for enhanced security and performance
✨ New Features
ZIO Ecosystem Support
Complete integration with the ZIO ecosystem for functional programming enthusiasts:
libraryDependencies += "io.github.takapi327" %% "ldbc-zio-interop" % "0.5.0"
Example Usage:
import zio.*
import ldbc.zio.interop.*
import ldbc.connector.*
import ldbc.dsl.*
object Main extends ZIOAppDefault:
private val datasource = MySQLDataSource
.build[Task]("127.0.0.1", 3306, "ldbc")
.setPassword("password")
.setDatabase("world")
override def run =
for
connection <- datasource.getConnection
connector = Connector.fromConnection(connection)
result <- sql"SELECT 1".query[Int].to[List].readOnly(connector)
yield result
Enhanced Authentication Plugins
Pure Scala3 authentication plugins provide enhanced security and cross-platform compatibility.
MySQL Clear Password Authentication
import ldbc.connector.*
import ldbc.authentication.plugin.*
val datasource = MySQLDataSource
.build[IO]("localhost", 3306, "cleartext-user")
.setPassword("plaintext-password")
.setDatabase("mydb")
.setSSL(SSL.Trusted) // Required for security
.setDefaultAuthenticationPlugin(MysqlClearPasswordPlugin)
AWS Aurora IAM Authentication
import ldbc.amazon.plugin.AwsIamAuthenticationPlugin
import ldbc.connector.*
val hostname = "aurora-instance.cluster-xxx.region.rds.amazonaws.com"
val username = "iam-user"
val config = MySQLConfig.default
.setHost(hostname)
.setUser(username)
.setDatabase("mydb")
.setSSL(SSL.Trusted)
val plugin = AwsIamAuthenticationPlugin.default[IO]("ap-northeast-1", hostname, username)
MySQLDataSource.pooling[IO](config, plugins = List(plugin)).use { datasource =>
val connector = Connector.fromDataSource(datasource)
// Execute queries
}
File-Based Query Execution
Execute SQL scripts and migrations directly from files with the new updateRaws method:
import ldbc.dsl.*
import fs2.io.file.{Files, Path}
import fs2.text
private def readFile(filename: String): IO[String] =
Files[IO]
.readAll(Path(filename))
.through(text.utf8.decode)
.compile
.string
for
sql <- readFile("migration.sql")
_ <- DBIO.updateRaws(sql).commit(connector)
yield ()
🔒 Security Enhancements
Enhanced SQL Parameter Escaping
Improved string parameter escaping provides stronger protection against SQL injection attacks.
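The escaping itself happens inside the connector, but the idea can be sketched in plain Scala. This is a simplified illustration only, not ldbc's actual implementation; the function name and the exact set of escaped characters are assumptions:

```scala
// Simplified sketch of MySQL-style string escaping (illustration only,
// not ldbc's implementation). Characters that could terminate a quoted
// literal are replaced with escape sequences before the value is spliced
// into a statement.
def escapeStringParam(value: String): String =
  value.flatMap {
    case '\''     => "\\'"  // single quote would end the literal
    case '"'      => "\\\"" // double quote
    case '\\'     => "\\\\" // backslash starts an escape sequence
    case '\u0000' => "\\0"  // NUL byte
    case '\n'     => "\\n"
    case '\r'     => "\\r"
    case other    => other.toString
  }

val escaped = escapeStringParam("x'; DROP TABLE city; --")
```

With escaping applied, the quote inside the parameter can no longer close the SQL string literal, which is the injection vector this release hardens against.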
⚡ Performance Optimizations
Maximum Packet Size Configuration
Better compatibility with MySQL server's max_allowed_packet setting:
val datasource = MySQLDataSource
.build[IO]("localhost", 3306, "user")
.setPassword("password")
.setDatabase("mydb")
.setMaxPacketSize(16777216) // 16MB (match MySQL server configuration)
Connection Pool Concurrency Improvements
Enhanced connection pool state management with atomic checks for improved stability in concurrent environments.
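The essence of such a fix is replacing check-then-act sequences with a single atomic transition. A minimal sketch of the pattern, assuming a hypothetical slot-state model (not ldbc's internal API):

```scala
import java.util.concurrent.atomic.AtomicInteger

// Hypothetical slot states, for illustration only.
object SlotState:
  val NotInUse = 0
  val InUse    = 1

final class PoolSlot:
  private val state = new AtomicInteger(SlotState.NotInUse)

  // The atomic check: the slot is claimed only if it is still NotInUse at
  // the instant of the compareAndSet, so two fibers racing for the same
  // slot can never both succeed.
  def tryClaim(): Boolean =
    state.compareAndSet(SlotState.NotInUse, SlotState.InUse)

  def release(): Unit =
    state.compareAndSet(SlotState.InUse, SlotState.NotInUse)

val slot   = PoolSlot()
val first  = slot.tryClaim() // succeeds: the slot was free
val second = slot.tryClaim() // fails: the slot is already claimed
```

A plain `if (state.get == NotInUse) state.set(InUse)` would leave a window between the read and the write; compareAndSet closes it.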
🚀 Features
- ZIO Support by @takapi327 in #562
- Feature/2025 11 added update raw by @takapi327 in #583
- Feature/2025 11 added my sql clear passowrd by @takapi327 in #592
- Feature/2025 11 create aws iam authentication plugin by @takapi327 in #593
- Feature/2025 12 create ldbc auth plugin by @takapi327 in #598
🔧 Refactoring
- Refactor/2025 09 issues 552 by @takapi327 in #556
- Refactor/2025 09 delete connection provider by @takapi327 in #557
- Refactor/2025 10 delete unused by @takapi327 in #564
- Dependencies/2025 10 scala3 compiler 3.3.7 by @takapi327 in #571
- Delete unused by @takapi327 in #578
- Refactor/2025 10 change active to mima check by @takapi327 in #581
- Refactor/2025 12 ssrf attack vulnerability response by @takapi327 in #600
- Added escape for string parameter by @takapi327 in #601
- Added force connection close in pool by @takapi327 in #602
- Refactor/2025 12 concurrent bag atomic check by @takapi327 in #603
- 🚀 Add configurable maxAllowedPacket setting for enhanced security and performance control by @takapi327 in #604
- Delete deprecated Provider by @takapi327 in #609
📖 Documentation
- Update README by @takapi327 in #566
- Refactor/2025 10 update version by @takapi327 in #579
- Documentation/2025 12 update document for 0.5.x by @takapi327 in #608
⛓️ Dependency update
- Update sbt-jmh from 0.4.7 to 0.4.8 by @scala-steward in #555
- Update logback-classic from 1.5.18 to 1.5.19 by @scala-steward in #558
- Update circe-generic from 0.14.14 to 0.14.15 by @scala-steward in #559
- Update otel4s-core-trace, otel4s-oteljava from 0.13.1 to 0.13.2 by @scala-steward in #560
- Update sbt, scripted-plugin from 1.11.6 to 1.11.7 by @scala-steward in #563
- Update sbt-typelevel, sbt-typelevel-site from 0.8.0 to 0.8.2 by @scala-steward in #565
- Update opentelemetry-exporter-otlp, ... from 1.54.1 to 1.55.0 by @scala-steward in #567
- Update scalafmt-core from 3.9.10 to 3.10.0 by @scala-steward in #569
- Update logback-classic from 1.5.19 to 1.5.20 by @scala-steward in #572
- Update scalafmt-core from 3.10.0 to 3.10.1 by @scala-steward in #573
- Update otel4s-core-trace, otel4s-oteljava from 0.13.2 to 0.14.0 by @scala-steward in #574
- Update sbt-scoverage from 2.3.1 to 2.4.0 by @scala-steward in #575
- Update http4s-circe, http4s-dsl, ... from 0.23.32 to 0.23.33 by @scala-steward in #582
- Update sbt-scoverage from 2.4.0 to 2.4.1 by @scala-steward in #584
- Update opentelemetry-exporter-otlp, ... from 1.55.0 to 1.56.0 by @scala-steward in #585
- Update doobie-core from 1.0.0-RC10 to 1.0.0-RC11 by @scala-steward in #586
- Update logback-classic from 1.5.20 to 1.5.21 by @scala-steward in #587
- Update scala3-compiler, scala3-library, ... from 3.7.3 to 3.7.4 by @scala-steward in #588
- Update sbt-typelevel, sbt-typelevel-site from 0.8.2 to 0.8.3 by @scala-steward in #589
- Update sbt-scoverage from 2.4.1 to 2.4.2 by @scala-steward in #590
- Update opentelemetry-exporter-otlp, ... from 1.56.0 to 1.57.0 by @scala-steward in #594
- Update sbt-scoverage from 2.4.2 to 2.4.3 by @scala-steward in #595
- Update logback-classic from 1.5.21 to 1.5.22 by @scala-steward in #596
- Update sbt-boilerplate from 0.7.0 to 0.8.0 by @scala-steward in #597
- Update sbt-typelevel, sbt-typelevel-site from 0.8.3 to 0.8.4 by @sc...
v0.4.0
ldbc v0.4.0 is released. 🎉
This release includes new features, enhancements to existing features, breaking changes, and much more.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
This release adds high-performance built-in connection pooling to the Pure Scala MySQL Connector. This enables efficient connection management optimized for the fiber-based concurrency model of Cats Effect, without requiring external libraries such as HikariCP.
Key Changes:
- New MySQLDataSource API: A new API replacing ConnectionProvider
- Built-in Connection Pooling: Includes CircuitBreaker, adaptive sizing, and leak detection
- Connector API: A new pattern for DBIO execution
- Streaming Support: Efficiently process large data volumes using fs2.Stream
- Enhanced OpenTelemetry Integration: MySQL-specific attributes and span names
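The CircuitBreaker in the built-in pool guards the database from repeated failing connection attempts. As a rough sketch of the idea (a hypothetical, simplified state machine; ldbc's actual implementation also handles timing and backoff):

```scala
// Simplified circuit-breaker state machine (illustration only).
enum BreakerState:
  case Closed, Open, HalfOpen

final class CircuitBreaker(maxFailures: Int):
  private var state: BreakerState = BreakerState.Closed
  private var failures: Int       = 0

  def currentState: BreakerState = state

  // A successful connection attempt resets the breaker.
  def onSuccess(): Unit =
    failures = 0
    state = BreakerState.Closed

  // Too many consecutive failures trip the breaker open, so further
  // attempts are rejected instead of hammering the database.
  def onFailure(): Unit =
    failures += 1
    if failures >= maxFailures then state = BreakerState.Open

  // After a cool-down the pool probes with a single trial attempt.
  def allowProbe(): Unit =
    if state == BreakerState.Open then state = BreakerState.HalfOpen

val breaker = CircuitBreaker(maxFailures = 3)
breaker.onFailure(); breaker.onFailure(); breaker.onFailure()
// the breaker is now Open: connection attempts would be short-circuited
```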
🎯 Migration Guide
From ConnectionProvider to MySQLDataSource
Old (0.3.x):
val provider = ConnectionProvider
.default[IO]("localhost", 3306, "root")
.setPassword("password")
.setDatabase("test")
New (0.4.x):
val dataSource = MySQLDataSource
.build[IO]("localhost", 3306, "root")
.setPassword("password")
.setDatabase("test")
val connector = Connector.fromDataSource(dataSource)
Using Connection Pooling
val pooledDataSource = MySQLDataSource.pooling[IO](
MySQLConfig.default
.setHost("localhost")
.setPort(3306)
.setUser("root")
.setPassword("password")
.setDatabase("test")
.setMinConnections(5)
.setMaxConnections(20)
)
pooledDataSource.use { pool =>
val connector = Connector.fromDataSource(pool)
// Execute DBIOs
}
Stream Support
import fs2.Stream
import ldbc.dsl.*
// Stream large datasets efficiently
val stream: Stream[DBIO, City] =
sql"SELECT * FROM city"
.query[City]
.stream(fetchSize = 1000)
⚠️ Important Notes
- Scala Native users: Connection pooling is not recommended on Scala Native due to single-threaded execution model
- Built-in connection pooling: External connection pool libraries (e.g., HikariCP) are no longer required
- Performance: The new connection pool is optimized for Cats Effect's fiber-based concurrency model
🚀 Features
- Feature/2025 06 added stream support by @takapi327 in #500
- Additional Stream Support by @takapi327 in #509
- Feature/2025 08 create data source by @takapi327 in #520
- Add connection pooling support for pure Scala MySQL connector by @takapi327 in #523
- Add date functions such as YEAR, MONTH, and DAY to Statement by @takapi327 in #371
💪 Enhancement
- Enhancement/2025 09 support jdk25 by @takapi327 in #540
- Enhancement/2025 09 added pool status reporter by @takapi327 in #542
🪲 Bug Fixes
- Bug/2025 07 fix codec decode error by @takapi327 in #515
🔧 Refactoring
- Discontinuation of the SchemaSPY project by @takapi327 in #435
- Deletion omission response by @takapi327 in #436
- Refactor/2025 04 discontinuation of the core project by @takapi327 in #437
- Feature/2025 07 create connector by @takapi327 in #512
- Refactor/2025 07 replace logger by @takapi327 in #514
- Refactor/2025 09 delete unused property by @takapi327 in #544
- Adapt OpenTelemetry trace information to MySQL specifications by @takapi327 in #546
- Refactor/2025 09 added tracer property by @takapi327 in #549
📖 Documentation
- Updates to the Performance Document by @takapi327 in #535
- Documentation/2025 09 update use new syntax by @takapi327 in #537
- Update select mapping document by @takapi327 in #545
- Documentation/2025 09 update mcp document by @takapi327 in #550
⛓️ Dependency update
- Feature/2025 06 option method by @takapi327 in #503
- Feature/2025 06 nel method by @takapi327 in #504
- Update typesafe:config from 1.4.4 to 1.4.5 by @scala-steward in #533
- Update scala3-compiler, scala3-library, ... from 3.7.2 to 3.7.3 by @scala-steward in #534
- Enhancement/2025 09 test reinforcement by @takapi327 in #539
- Update scalafmt-core from 3.9.9 to 3.9.10 by @scala-steward in #538
- Update opentelemetry-exporter-otlp, ... from 1.54.0 to 1.54.1 by @scala-steward in #541
- Delete schemaspy dependencies by @takapi327 in #551
Full Changelog: v0.3.3...v0.4.0
v0.3.3
ldbc v0.3.3 is released.
This release updates dependencies.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
⛓️ Dependency update
- Update circe-yaml from 0.16.0 to 0.16.1 by @scala-steward in #497
- Update circe-generic from 0.14.13 to 0.14.14 by @scala-steward in #496
- Update scalafmt-core from 3.9.7 to 3.9.8 by @scala-steward in #501
- Update specs2-core, specs2-junit from 5.6.3 to 5.6.4 by @scala-steward in #502
- Update sbt, scripted-plugin from 1.11.2 to 1.11.3 by @scala-steward in #505
- Update cats-effect from 3.6.1 to 3.6.2 by @scala-steward in #506
- Update typesafe:config from 1.4.3 to 1.4.4 by @scala-steward in #508
- Update opentelemetry-exporter-otlp, ... from 1.51.0 to 1.52.0 by @scala-steward in #510
- Update doobie-core from 1.0.0-RC9 to 1.0.0-RC10 by @scala-steward in #511
- Update otel4s-core-trace, otel4s-oteljava from 0.12.0 to 0.13.1 by @scala-steward in #507
- Update HikariCP from 6.3.0 to 6.3.1 by @scala-steward in #513
- Update HikariCP from 6.3.1 to 6.3.2 by @scala-steward in #516
- Update cats-effect from 3.6.2 to 3.6.3 by @scala-steward in #517
- Update scalafmt-core from 3.9.8 to 3.9.9 by @scala-steward in #522
- Update scala3-compiler, scala3-library, ... from 3.7.1 to 3.7.2 by @scala-steward in #521
- Update opentelemetry-exporter-otlp, ... from 1.52.0 to 1.53.0 by @scala-steward in #525
- Update sbt, scripted-plugin from 1.11.3 to 1.11.4 by @scala-steward in #526
- Update HikariCP from 6.3.2 to 7.0.1 by @scala-steward in #524
- Update HikariCP from 7.0.1 to 7.0.2 by @scala-steward in #527
- Update sbt, scripted-plugin from 1.11.4 to 1.11.5 by @scala-steward in #528
- Update fs2-core, fs2-io from 3.12.0 to 3.12.2 by @scala-steward in #529
- Update sbt-scalajs, scalajs-library_2.13, ... from 1.19.0 to 1.20.1 by @scala-steward in #531
- Update opentelemetry-exporter-otlp, ... from 1.53.0 to 1.54.0 by @scala-steward in #530
- Update sbt, scripted-plugin from 1.11.5 to 1.11.6 by @scala-steward in #532
Full Changelog: v0.3.2...v0.3.3
v0.3.2
ldbc v0.3.2 is released.
This release fixes bugs and updates dependencies.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
An error occurred when null was returned for a non-Option type that is converted to another type.
Here, a String is returned as null, and the subsequent split call throws a NullPointerException.
given Codec[List[String]] = Codec[String].imap(_.split(",").toList)(_.mkString(","))
This has been corrected so that custom type definitions no longer raise an exception.
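The underlying failure is easy to reproduce in plain Scala, independent of ldbc: calling a method on a null String throws before any mapping logic can run:

```scala
// A null String, as a driver might return for a NULL column decoded
// into a non-Option type.
val raw: String = null

// split throws NullPointerException before any custom conversion runs;
// this is the failure the fix guards against.
val outcome =
  try { raw.split(","); "no exception" }
  catch { case _: NullPointerException => "NPE" }

// Wrapping in Option first makes the same conversion safe.
val safe: List[String] =
  Option(raw).map(_.split(",").toList).getOrElse(Nil)
```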
🪲 Bug Fixes
- Bug/2025 06 fix codec decode error by @takapi327 in #495
⛓️ Dependency update
- Update sbt, scripted-plugin from 1.11.1 to 1.11.2 by @scala-steward in #494
Full Changelog: v0.3.1...v0.3.2
v0.3.1
ldbc v0.3.1 is released.
This release fixes some bugs and updates some dependencies.
In addition, test coverage using Codecov has been introduced and additional missing tests have been implemented accordingly.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
💪 Enhancement
- Enhancement/2025 05 correction of coverages testing omissions by @takapi327 in #474
- Enhancement/2025 05 correction of coverages testing omissions by @takapi327 in #475
- Enhancement/2025 05 correction of coverages testing omissions by @takapi327 in #477
- Enhancement/2025 05 correction of coverages testing omissions by @takapi327 in #479
- Enhancement/2025 05 correction of coverages testing omissions by @takapi327 in #480
- Enhancement/2025 05 used nix by @takapi327 in #481
🪲 Bug Fixes
- Bugfix/2025 06 Fix jdbc connector by @takapi327 in #488
🔧 Refactoring
- Refactor/2025-05 Update test by @takapi327 in #478
- Update 0.3.0 -> 0.3.1 by @takapi327 in #493
⛓️ Dependency update
- Update slick from 3.6.0 to 3.6.1 by @scala-steward in #476
- Update sbt, scripted-plugin from 1.10.11 to 1.11.0 by @scala-steward in #482
- Update scalafmt-core from 3.9.6 to 3.9.7 by @scala-steward in #483
- Update sbt, scripted-plugin from 1.11.0 to 1.11.1 by @scala-steward in #484
- Depenedencies/2025 06 update scala by @takapi327 in #487
- Dependencies/2025 06 fix sbt plugin by @takapi327 in #492
- Update opentelemetry-exporter-otlp, ... from 1.50.0 to 1.51.0 by @scala-steward in #490
Full Changelog: v0.3.0...v0.3.1
v0.3.0
ldbc v0.3.0 is released. 🎉
This release includes new features, enhancements to existing features, breaking changes, and much more.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
This is the first official release of 0.3.x, which includes significant improvements over 0.2.x.
Please refer to Migration Notes (from 0.2.x to 0.3.x) for migration instructions.
ldbc is available on the JVM, Scala.js, and Scala Native.
Please refer to Tutorial and QA for more information on how to use it.
You can also use MCP Server to perform learning and code generation.
Additional features that were not in the RC version
Removed the original Enum definition and changed to use the standard scala.reflect.Enum type.
This allows users to use pure Enum as is.
enum Status:
case Active, InActive
given Codec[Status] = Codec.derivedEnum[Status]
sql"SELECT 'Active'".query[Status].to[Option]
The DataType definition in Schema is also simpler.
- enum Status extends ldbc.schema.model.Enum:
- case Active, InActive
- object Status extends EnumDataType[Status]
+ enum Status:
+ case Active, InActive
- ENUM[Status](using Status).queryString
+ ENUM[Status].queryString
// "ENUM('Active','InActive') NOT NULL"
Add support for NamedTuple, which became an official feature in Scala 3.7.
Adding this support will allow users to use NamedTuple directly.
for
(user, order) <- sql"SELECT u.*, o.* FROM `user` AS u JOIN `order` AS o ON u.id = o.user_id".query[(user: User, order: Order)].unsafe
users <- sql"SELECT id, name, email FROM `user`".query[(id: Long, name: String, email: String)].to[List]
yield
println(s"Result User: $user")
println(s"Result Order: $order")
users.foreach { user =>
println(s"User ID: ${user.id}, Name: ${user.name}, Email: ${user.email}")
}
// Result User: User(1,Alice,alice@example.com,2025-05-20T03:22:09,2025-05-20T03:22:09)
// Result Order: Order(1,1,1,2025-05-20T03:22:09,1,2025-05-20T03:22:09,2025-05-20T03:22:09)
// User ID: 1, Name: Alice, Email: alice@example.com
// User ID: 2, Name: Bob, Email: bob@example.com
// User ID: 3, Name: Charlie, Email: charlie@example.com
🚀 Features
- Preparation for the start of multi-platform support by @takapi327 in #122
- Feature/2024 01 create ldbc connector project by @takapi327 in #128
- Feature/2024 02 create statement for select query by @takapi327 in #134
- Feature/2024 03 create prepared statement by @takapi327 in #158
- Feature/2024 03 Create Server PreparedStatement by @takapi327 in #159
- Feature/2024 03 create transaction by @takapi327 in #167
- Feature/2024 03 create savepoint by @takapi327 in #173
- Feature/2024 03 add com statistics utility commands by @takapi327 in #180
- Feature/2024 04 add com ping utility commands by @takapi327 in #181
- Feature/2024 04 add com reset connection utility commands by @takapi327 in #182
- Feature/2024 04 add com set option utility commands by @takapi327 in #183
- Feature/2024 04 add com change user utility commands by @takapi327 in #184
- Feature/2024 04 add batch command by @takapi327 in #186
- Feature/2024 04 create callable statement by @takapi327 in #205
- Feature/2024 05 create sql string context by @takapi327 in #211
- Feature/2024 05 convenience method added for sql construction by @takapi327 in #214
- Feature/2024 05 create jdbc connector package by @takapi327 in #215
- Feature/2024 05 create query helper by @takapi327 in #220
- Feature/2024 06 create static parameter by @takapi327 in #223
- Feature/2024 07 enhanced error handling by @takapi327 in #245
- Feature/2024 12 create codec by @takapi327 in #349
- Feature/2025 02 add before after process by @takapi327 in #393
- Add a function to perform a single operation on the DBIO by @takapi327 in #395
- Feature/2025 03 create provider by @takapi327 in #412
- Create npmPublish yml by @takapi327 in #458
💪 Enhancement
- Enhancement/2024 01 add sbt header plugin by @takapi327 in #120
- Enhancement/2024 01 multi platform support for core project by @takapi327 in #123
- Enhancement/2024 01 multi platform support for sql project by @takapi327 in #124
- Enhancement/2024 01 multi platform support for query builder project by @takapi327 in #125
- Enhancement/2024 01 multi platform support for codegen project by @takapi327 in #126
- Update issue templates by @takapi327 in #154
- Enhancement/2024 03 add database login by @takapi327 in #157
- Enhancement/2024 05 add large update by @takapi327 in #219
- Enhancement/2024 07 sql exception message enhancement by @takapi327 in #246
- Enhancement/2024 07 performance improvements by @takapi327 in #256
- Enhancement/2024 08 use scala native config brew by @takapi327 in #275
- Enhancement/2024 07 additional benchmarks by @takapi327 in #257
- Enhancement/2024 09 raises er...
v0.3.0-RC2
ldbc v0.3.0-RC2 is released.
This release includes new features, enhancements to existing features, breaking changes, and much more.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
SSL Extensions
Added functionality for SSL connections using certificates on various platforms.
For JVM
def fromKeyStoreResource(
resource: String,
storePassword: Array[Char],
keyPassword: Array[Char]
): SSL
Sample code
ConnectionProvider
.default[IO]("127.0.0.1", 3306, "user", "password", "database")
.setSSL(SSL.fromKeyStoreResource("keystore.jks", "password".toCharArray, "password".toCharArray))
.use { conn =>
???
}
For JS
def fromSecureContext(
secureContext: SecureContext
): SSL
Sample code
for
ca <- Files[IO].readAll(Path("path/to/ca.pem")).through(text.utf8.decode).compile.string
secureContext = SecureContext(
ca = List(ca.asRight).some,
cert = None,
key = None
)
result <- ConnectionProvider
.default[IO]("127.0.0.1", 3306, "user", "password", "database")
.setSSL(SSL.fromSecureContext(secureContext))
.use { conn => ??? }
yield result
For Native
def fromS2nConfig(
config: S2nConfig
): SSL
Sample code
for
ca <- Resource.eval(Files[IO].readAll(Path("path/to/ca.pem")).through(text.utf8.decode).compile.string)
cfg <- S2nConfig.builder.withPemsToTrustStore(List(ca)).build[IO]
connection <- ConnectionProvider
.default[IO]("127.0.0.1", 3306, "user", "password", "database")
.setSSL(SSL.fromS2nConfig(cfg))
.createConnection()
yield connection
From this release, ldbc supports all TLS modes provided by fs2. Below is a list of the available SSL modes:
| Mode | Platform | Details |
|---|---|---|
| SSL.None | JVM/JS/Native | ldbc will not request SSL. This is the default. |
| SSL.Trusted | JVM/JS/Native | Connect via SSL and trust all certificates. Use this if you're running with a self-signed certificate, for instance. |
| SSL.System | JVM/JS/Native | Connect via SSL and use the system default SSLContext to verify certificates. Use this if you're running with a CA-signed certificate. |
| SSL.fromSSLContext(…) | JVM | Connect via SSL using an existing SSLContext. |
| SSL.fromKeyStoreFile(…) | JVM | Connect via SSL using a specified keystore file. |
| SSL.fromKeyStoreResource(…) | JVM | Connect via SSL using a specified keystore classpath resource. |
| SSL.fromKeyStore(…) | JVM | Connect via SSL using an existing Keystore. |
| SSL.fromSecureContext(...) | JS | Connect via SSL using an existing SecureContext. |
| SSL.fromS2nConfig(...) | Native | Connect via SSL using an existing S2nConfig. |
Documentation for LLMs
Documentation for LLMs has been created and published.
Currently, we have the following root-level files...
- /llms.txt — a listing of the available files
- /llms-full.txt — complete documentation for ldbc
- /llms-small.txt — compressed documentation for use with smaller context windows
What's Changed
💪 Enhancement
- Enhancement/2025 04 added ssl test by @takapi327 in #452
🔧 Refactoring
- Refactor/2025 4 minor correction by @takapi327 in #444
- Refactor/2025 4 update copyright year by @takapi327 in #445
📖 Documentation
- Added llms txt link by @takapi327 in #449
⛓️ Dependency update
- Update cats-effect from 3.5.7 to 3.6.0 by @scala-steward in #429
- Update munit-cats-effect from 2.0.0 to 2.1.0 by @scala-steward in #430
- Update/fs2 core 3.12.0 by @takapi327 in #432
- Update logback-classic from 1.5.16 to 1.5.18 by @scala-steward in #425
- Update HikariCP from 6.2.1 to 6.3.0 by @scala-steward in #428
- Dependencies/2025 04 update otel4s by @takapi327 in #433
- Update specs2-core, specs2-junit from 5.5.8 to 5.6.0 by @scala-steward in #441
- Update opentelemetry-exporter-otlp, ... from 1.48.0 to 1.49.0 by @scala-steward in #440
- Update cats-effect from 3.6.0 to 3.6.1 by @scala-steward in #442
- Update specs2-core, specs2-junit from 5.6.0 to 5.6.1 by @scala-steward in #447
- Documentation/2025 04 added zio usage by @takapi327 in #450
- Update specs2-core, specs2-junit from 5.6.1 to 5.6.2 by @scala-steward in #448
- Update doobie-core from 1.0.0-RC8 to 1.0.0-RC9 by @scala-steward in #451
- Update circe-generic from 0.14.12 to 0.14.13 by @scala-steward in #453
- Update sbt-scalajs, scalajs-library_2.13, ... from 1.18.2 to 1.19.0 by @scala-steward in #454
Full Changelog: v0.3.0-RC1...v0.3.0-RC2
v0.3.0-RC1
ldbc v0.3.0-RC1 is released.
This release includes new features, enhancements to existing features, breaking changes, and much more.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
Significant performance improvements
The most significant feature of this release is the significant improvement in performance.
Previous versions were clearly slower than jdbc in read processing.
After several improvements, performance increased greatly and benchmark results now exceed jdbc.
Before
After
What's Changed
Added withBeforeAfter function
Added functionality to connectors to allow processing to be added after the connection to the database is created and before it is destroyed.
This functionality can be achieved by using the withBeforeAfter method for Connection generation.
The second type parameter of withBeforeAfter specifies the type of the Before result that is passed to After.
Connection.withBeforeAfter[IO, Unit](
...,
before = _ => IO.unit,
after = (_, _) => IO.unit
)
This feature allows users to build connections where common processing takes place.
For example, it is possible to create a table beforehand and drop it afterwards.
def before(connection: Connection[IO]): IO[Int] =
DBIO.sequence(user.schema.create).commit(connection) *>
IO(user.schema.create.statements.length)
def after(length: Int, connection: Connection[IO]): IO[Unit] =
DBIO.sequence(user.schema.drop).commit(connection) *>
IO.println(s"Created $length tables and dropped them")
Connection.withBeforeAfter[IO, Unit](
...,
before,
after
)
Changed implicit passing of LogHandler
LogHandler was passed implicitly to various DBIO functions.
However, this is redundant because a LogHandler must always be provided at the point where DBIO is executed.
There is little demand for changing the LogHandler per operation; it should be sufficient to keep using the LogHandler set up initially.
Therefore, the LogHandler can be set at the time of connection creation so that a common LogHandler can be used for each connection.
-given LogHandler[IO] = ???
Connection[IO](
+ logHandler = ???
)
Changed Connection creation to use Provider
Changed the connection creation method to use Provider when using either ldbc or jdbc connectors.
ldbc
The ldbc Provider is constructed by passing mandatory properties such as host, port, and user.
ConnectionProvider.default[IO]("127.0.0.1", 13306, "ldbc")
Additional settings are set using the setXXX methods. The following sets additional password and database values.
ConnectionProvider
.default[IO]("127.0.0.1", 13306, "ldbc")
.setPassword("password")
.setDatabase("world")
When using the Provider, ldbc connections can run arbitrary processing before and after, using withBeforeAfter.
val before = ???
val after = ???
ConnectionProvider
.default[IO]("127.0.0.1", 13306, "ldbc")
.withBeforeAfter(before, after)
jdbc
Note that jdbc creates connections based on a DataSource, and that a DB-specific execution context must be specified when creating a connection from a DataSource.
val ds = new MysqlDataSource()
ConnectionProvider.fromDataSource(ds, ExecutionContexts.synchronous)
Methods such as fromConnection and fromDriverManager are provided as well as creation from a DataSource.
Usage
A connection can be obtained from the Provider with use.
provider.use { connection =>
???
}
use relies on Resource internally and closes the connection when it is finished.
It is also possible to use a connection wrapped in Resource using createConnection.
provider.createConnection().use { connection =>
???
}
Change DBIO to Free Monad
DBIO was converted to a Free Monad using Cats, which removed the effect type from DBIO.
This eliminates the need for syntax extensions to provide extension methods, so users no longer have to write multiple imports when using a function.
Using dsl
- import ldbc.dsl.io.*
+ import ldbc.dsl.*
Using query builder
- import ldbc.query.builder.io.*
+ import ldbc.query.builder.*
Using schema
- import ldbc.schema.io.*
+ import ldbc.schema.*
Elimination of automatic derivation
Implicit auto-derivation is now prevented, since recursive generation by auto-derivation causes compile times to explode.
Rather than being eliminated, automatic derivation is provided under a package whose name makes its compile-time cost easy to identify.
import ldbc.dsl.codec.auto.generic.toSlowCompile.given
🚀 Features
- Feature/2025 02 add before after process by @takapi327 in #393
- Add a function to perform a single operation on the DBIO by @takapi327 in #395
- Feature/2025 03 create provider by @takapi327 in #412
💪 Enhancement
- Enhancement/2025 03 support jdk21 by @takapi327 in #408
🔧 Refactoring
- Added Connection alias by @takapi327 in #397
- Refactor/2025 02 replace log handler by @takapi327 in #398
- Refactor/2025 02 issues 404 by @takapi327 in #401
- Refactor/2025 03 issues 405 by @takapi327 in #411
- Correction of log output method by @takapi327 in #418
- Refactor/2025 03 dbio to free monad by @takapi327 in #419
- Added getter method by @takapi327 in #420
- Rename MySQLProvider -> ConnectionProvider by @takapi327 in #421
- Change automatic Codec derivation to optional processing by @takapi327 in #423
📖 Documentation
- Document/2024 10 update document for v0.3 by @takapi327 in #317
- Enhancement/2025 03 added llms txt by @takapi327 in #424
⛓️ Dependency update
- Update scalafmt-core from 3.9.0 to 3.9.1 by @scala-steward in #396
- Update doobie-core from 1.0.0-RC7 to 1.0.0-RC8 by @scala-steward in #400
- Update scalafmt-core from 3.9.1 to 3.9.2 by @scala-steward in #403
- Update scalafmt-core from 3.9.2 to 3.9.3 by @scala-steward in #409
- Update scala3-compiler, scala3-library, ... from 3.6.3 to 3.6.4 by @scala-steward in #410
- Update sbt, scripted-plugin from 1.10.7 to 1.10.10 by @scala-steward in #407
- Update scalafmt-core from 3.9.3 to 3.9.4 by @scala-steward in #414
- Update slick from 3.5.2 to 3.6.0 by @scala-steward in #413
- Update sbt, scripted-plugin from 1.10.10 to 1.10.11 by @scala-steward in #417
- Update circe-generic from 0.14.10 to 0.14.12 by @scala-steward in #416
Full Changelog: v0.3.0-beta11...v0.3.0-RC1
v0.3.0-beta11
ldbc v0.3.0-beta11 is released.
This release includes new features, enhancements to existing features, breaking changes, and much more.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
Adding DataType Column
Previously, giving a column a data type or other settings required passing them as arguments.
With this change, columns can be defined with data-type-specific methods.
With this style of definition, the variable name is used as the column name, so it is no longer necessary to pass the column name as an argument.
class EntityTable extends Table[Entity]("entity"):
- def c1: Column[Long] = column[Long]("c1", BIGINT, AUTO_INCREMENT)
+ def c1: Column[Long] = bigint().autoIncrement
Column names can have their format changed by implicitly providing a Naming instance.
The default is CamelCase; to change this to PascalCase, do the following:
class EntityTable extends Table[Entity]("entity"):
given Naming = Naming.PASCAL
def c1: Column[Long] = bigint().autoIncrement
If you want to change the format of a particular column, you can still define it by passing the column name as an argument.
class EntityTable extends Table[Entity]("entity"):
given Naming = Naming.PASCAL
def c1: Column[Long] = bigint().autoIncrement
def c2: Column[Long] = bigint("c_2")
Adding DDL Schema
Add schema function to perform DDL.
class UserTable extends Table[User]("user"):
def id: Column[Long] = bigint().autoIncrement.primaryKey
def name: Column[String] = varchar(255)
def age: Column[Option[Int]] = int()
override def * : Column[User] = (id *: name *: age).to[User]
val userTable = TableQuery[UserTable]
connection
.use { conn =>
DBIO
.sequence(
userTable.schema.create,
userTable.schema.createIfNotExists,
userTable.schema.dropIfExists,
userTable.schema.create
)
.commit(conn)
}
Schema can also be composed with other Schemas.
userTable.schema ++ userProfileTable.schema
The SQL statements to be executed can be inspected with the statements method.
userTable.schema.create.statements.foreach(println)
userTable.schema.createIfNotExists.statements.foreach(println)
userTable.schema.drop.statements.foreach(println)
userTable.schema.dropIfExists.statements.foreach(println)
userTable.schema.truncate.statements.foreach(println)
Performance Improvement
This change was made because splitting with splitAt proved faster than using the helper functions provided by scodec.
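The release notes describe the technique only in words; a minimal, self-contained sketch in plain Scala shows what index-based splitting with splitAt looks like. Nothing here is ldbc's actual internal code, and `chunked` is an illustrative name:

```scala
// Split a sequence into fixed-size chunks using splitAt.
// Illustrative only: ldbc's internals operate on its own packet/byte types.
def chunked[A](data: Vector[A], maxSize: Int): Vector[Vector[A]] =
  if data.isEmpty then Vector.empty
  else
    // A single index-based split per chunk, with no bit-level
    // helper machinery involved.
    val (head, tail) = data.splitAt(maxSize)
    head +: chunked(tail, maxSize)
```

For example, `chunked(Vector(1, 2, 3, 4, 5), 2)` yields `Vector(Vector(1, 2), Vector(3, 4), Vector(5))`.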
💪 Enhancement
- Enhancement/2025 01 survey performance by @takapi327 in #380
- Enhancement/2025 02 data type column by @takapi327 in #388
- Enhancement/2025 02 added ddl schema by @takapi327 in #389
- Enhancement/2025 02 added key by @takapi327 in #390
🪲 Bug Fixes
- Comparison operator compile error due to use of Opaque Type Alias by @takapi327 in #357
- Empty VALUES in INSERT statement causes error statement to be issued by @takapi327 in #362
- Passing an empty value in the IN clause of a WHERE statement throws an error by @takapi327 in #364
- Fixed a bug that caused incorrect statements to be issued when keys w… by @takapi327 in #392
🔧 Refactoring
- Update bug report template by @takapi327 in #358
- Changed to use export function instead of creating Alias Type. by @takapi327 in #372
- refactor(connector): use hashing from fs2 by @i10416 in #368
- tidy: use effect aware uuid gen by @i10416 in #373
- tidy: use weaker effect constraints by @i10416 in #374
- Fixed ResultSetImpl get value logic by @takapi327 in #381
- Enclose columns in bag quotation marks by @takapi327 in #385
- Refactor/2025 02 key settings by @takapi327 in #387
⛓️ Dependency update
- Update sbt-scalajs, scalajs-library_2.13, ... from 1.17.0 to 1.18.1 by @scala-steward in #361
- Update sbt-typelevel, sbt-typelevel-site from 0.7.5 to 0.7.6 by @scala-steward in #367
- Update scalafmt-core from 3.8.3 to 3.8.6 by @scala-steward in #376
- Update sbt-typelevel, sbt-typelevel-site from 0.7.6 to 0.7.7 by @scala-steward in #377
- Update scala3-compiler, scala3-library, ... from 3.6.2 to 3.6.3 by @scala-steward in #370
- Update scala3-compiler, scala3-library, ... from 3.3.4 to 3.3.5 by @scala-steward in #382
- Update sbt-scalajs, scalajs-library_2.13, ... from 1.18.1 to 1.18.2 by @scala-steward in #375
- Update doobie-core from 1.0.0-RC6 to 1.0.0-RC7 by @scala-steward in #386
- Update scalafmt-core from 3.8.6 to 3.9.0 by @scala-steward in #391
New Contributors
@i10416 made their first contribution. Thanks!
Full Changelog: v0.3.0-beta10...v0.3.0-beta11
v0.3.0-beta10
ldbc v0.3.0-beta10 is released.
This release includes new features, enhancements to existing features, breaking changes, and much more.
Note
ldbc is pre-1.0 software and is still undergoing active development. New versions are not binary compatible with prior versions, although in most cases user code will be source compatible.
The major version will be the stable version.
What's Changed
Caution
This version is not compatible with the previous version, v0.3.0-beta9.
Adding Codec
If both Encoder and Decoder were needed, each had to be defined. With this modification, a new Codec has been added, allowing Encoder and Decoder to be defined together.
enum Status:
case Active, InActive
-given Encoder[Status] = Encoder[Boolean].contramap {
- case Status.Active => true
- case Status.InActive => false
-}
-given Decoder[Status] = Decoder[Boolean].map {
- case true => Status.Active
- case false => Status.InActive
-}
+given Codec[Status] = Codec[Boolean].imap {
+ case true => Status.Active
+ case false => Status.InActive
+} {
+ case Status.Active => true
+ case Status.InActive => false
+}
By building a Codec, it can also be used in place of a Decoder or an Encoder.
-given Decoder[City] = (Decoder[Int] *: Decoder[String] *: Decoder[Int]).to[City]
+given Codec[City] = (Codec[Int] *: Codec[String] *: Codec[Int]).to[City]
Decoder has also been modified to allow Either to be used during construction.
This makes it possible to handle values that do not match any expected pattern:
enum Status(val code: Int):
case InActive extends Status(0)
case Active extends Status(1)
given Decoder[Status] = Decoder[Int].emap {
case 0 => Right(Status.InActive)
case 1 => Right(Status.Active)
case unknown => Left(s"$unknown is Unknown Status code")
}
This modification allows Codec to use Either for construction.
given Codec[Status] = Codec[Int].eimap {
case 0 => Right(Status.InActive)
case 1 => Right(Status.Active)
case unknown => Left(s"$unknown is Unknown Status code")
}(_.code)
Additional function
Additional conditions in a Where clause can now be conditionally excluded:
TableQuery[City]
.select(_.name)
.where(_.population > 1000000)
.and(_.name == "Tokyo", false)
// SELECT name FROM city WHERE population > ?
A function has also been added to the Where clause that decides whether to add a condition to the statement depending on an Option value.
val opt: Option[String] = ???
TableQuery[City]
.select(_.name)
.whereOpt(city => opt.map(value => city.name === value))
TableQuery[City]
.select(_.name)
.whereOpt(opt)((city, value) => city.name === value)
💣 Breaking Change
Modification of Encoder and Decoder into a composable form using twiddles.
The method of building custom-type Decoders has changed. With this modification, Decoder can be converted to any type using the map function.
- given Decoder.Elem[Continent] = Decoder.Elem.mapping[String, Continent](str => Continent.valueOf(str.replace(" ", "_")))
+ given Decoder[Continent] = Decoder[String].map(str => Continent.valueOf(str.replace(" ", "_")))
Decoder is still constructed implicitly.
case class City(id: Int, name: String, age: Int)
sql"SELECT id, name, age FROM city LIMIT 1"
.query[City]
.to[Option]
.readOnly(conn)
However, implicit searches may fail if there are many properties in the model.
[error] |Implicit search problem too large.
[error] |an implicit search was terminated with failure after trying 100000 expressions.
[error] |The root candidate for the search was:
[error] |
[error] | given instance given_Decoder_P in object Decoder for ldbc.dsl.codec.Decoder[City]
In such cases, raising the search limit in the compiler options may resolve the problem.
scalacOptions += "-Ximplicit-search-limit:100000"
However, this may lead to increased compilation times. In that case, the problem can also be resolved by manually building the Decoder as follows.
given Decoder[City] = (Decoder[Int] *: Decoder[String] *: Decoder[Int]).to[City]
This is true not only for Decoder but also for Encoder.
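To see why a hand-written instance avoids the implicit-search blowup, a self-contained analogue with plain functions can help. `Reader` below merely stands in for ldbc's Decoder and is not part of the library:

```scala
// `Reader` is an illustrative stand-in for a row decoder, not ldbc's API.
type Reader[A] = List[String] => A

case class City(id: Int, name: String, age: Int)

// Writing the composition by hand fixes each column's position directly,
// so the compiler performs no implicit search to assemble the instance.
given cityReader: Reader[City] = row => City(row(0).toInt, row(1), row(2).toInt)
```

Here `summon[Reader[City]]` resolves in a single step, whereas an automatically derived instance for a large case class forces the compiler to explore many candidate compositions.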
Rename Executor to DBIO
The type used to represent IO against the DB was Executor; it was renamed because DBIO is more intuitive for users.
- trait Executor[F[_]: Temporal, T]:
+ trait DBIO[F[_]: Temporal, T]:
Migrate table name designation to derived
The way a table name is specified with the query builder has changed: it is now passed as an argument to Table's derived method instead of as an argument to TableQuery.
Before
case class City(
id: Int,
name: String,
countryCode: String,
district: String,
population: Int
) derives Table
val table = TableQuery[City]("city")
After
case class City(
id: Int,
name: String,
countryCode: String,
district: String,
population: Int
)
object City:
given Table[City] = Table.derived[City]("city")Renewal of Schema project
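Placing the given in the companion object works because instances defined there are part of the type's implicit scope and are found without any import. Below is a self-contained sketch of the pattern, where this `Table` trait is a stand-in rather than ldbc's actual type:

```scala
// Stand-in typeclass carrying a table name; not ldbc's actual Table.
trait Table[A]:
  def name: String

object Table:
  // Mirrors the shape of Table.derived[City]("city"): builds an
  // instance that records the caller-supplied table name.
  def derived[A](tableName: String): Table[A] = new Table[A]:
    def name: String = tableName

case class City(id: Int, name: String)

object City:
  // Defined in the companion object, this given is always in implicit scope.
  given Table[City] = Table.derived[City]("city")
```

With this in place, `summon[Table[City]].name` returns `"city"` anywhere in the program, with no import required.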
This modification changes the way Table types are constructed using the Schema project.
Below we will look at the construction of the Table type corresponding to the User model.
case class User(
id: Long,
name: String,
age: Option[Int],
)
Before
Until now, instances of Table had to be created directly. The arguments to Table had to be the corresponding columns, passed in the same order as the properties of the User class, and setting the data type of each column was also mandatory.
The TableQuery built from this table type was implemented using Dynamic, which allows type-safe access, but development tools could not provide code completion.
This construction method was also somewhat slower to compile than class generation.
val userTable = Table[User]("user")(
column("id", BIGINT, AUTO_INCREMENT, PRIMARY_KEY),
column("name", VARCHAR(255)),
column("age", INT.UNSIGNED.DEFAULT(None)),
)
After
In this modification, Table type generation has been changed to defining a class that extends Table. In addition, the data type of a column is no longer required; it can be set by the implementer as needed.
This change to a construction method similar to Slick's makes it more familiar to implementers.
class UserTable extends Table[User]("user"):
def id: Column[Long] = column[Long]("id")
def name: Column[String] = column[String]("name")
def age: Column[Option[Int]] = column[Option[Int]]("age")
override def * : Column[User] = (id *: name *: age).to[User]
The data type of the columns can still be set. This setting is used, for example, when generating a schema using this table class.
class UserTable extends Table[User]("user"):
def id: Column[Long] = column[Long]("id", BIGINT, AUTO_INCREMENT, PRIMARY_KEY)
def name: Column[String] = column[String]("name", VARCHAR(255))
def age: Column[Option[Int]] = column[Option[Int]]("age", INT.UNSIGNED.DEFAULT(None))
override def * : Column[User] = (id *: name *: age).to[User]
🚀 Features
- Feature/2024 12 create codec by @takapi327 in #349
💪 Enhancement
- Enhancement/2024 12 added where condition by @takapi327 in #338
- Enhancement/2024 12 encoder extensions by @takapi327 in #347
- Enhancement/2024 12 make encoder and decoder compatible with twiddles by @takapi327 in #348
- Enhancement/2025 01 make decoder support either by @takapi327 in #350
- Enhancement/2025 01 update where statement by @takapi327 in #353
🪲 Bug Fixes
- Bugfix/2024 12 fixed single column insert by @takapi327 in #343
- Bugfix/2025 01 table query macro error by @takapi327 in #354
🔧 Refactoring
- Refactor/2024 12 schema migration by @takapi327 in #333
- Refactor/2024 12 change use schema by @takapi327 in #334
- Reafactor/2024 12 schema project replacement by @takapi327 in #336
- Refactor/2024 12 migrate table name designation to derived by @takapi327 in #337
- Fixed column comment by @takapi327 in #339
- Fixed Encoder fold use summonAll by @takapi327 in #340
- Refactor/2024 12 change test specs2 to scalatest by @takapi327 in #341
- Refactor/2024 12 fixed on duplicate key update parameter by @takapi327 in #342
- Refactor/2024 12 rename executor by @takapi327 in #344
- Refactor/2025 01 fixed dbio warning by @takapi327 in #351
- Refactor/2025 01 added import sort rule by @takapi327 in #355
⛓️ Dependency update
- Update HikariCP from 6.1.0 to 6.2.1 by @scala-steward in #327
- Update otel4s-core-trace from 0.11.1 to 0.11.2 by @scala-steward in #332
- Update sbt, scripted-plugin from 1.10.6 to 1.10.7 by @scala-steward in https://gith...