Performance Bottleneck When Using GORM with Bulk Inserts in Go
I'm optimizing a Go application that uses GORM to bulk-insert records into a PostgreSQL database, and I've hit a performance wall. I need to insert a large number of records (around 10,000), but the operation takes roughly 30 seconds, far longer than expected. Profiling suggests the bottleneck is in how GORM handles the bulk insert.

Here's a simplified version of the code I'm using:

```go
type User struct {
	ID    uint   `gorm:"primaryKey"`
	Name  string `gorm:"size:100"`
	Email string `gorm:"size:100;unique"`
}

var users []User
for i := 0; i < 10000; i++ {
	users = append(users, User{
		Name:  fmt.Sprintf("User%d", i),
		Email: fmt.Sprintf("user%d@example.com", i),
	})
}
db.Create(&users)
```

I also tried `db.Save(&users)` and `db.CreateInBatches(&users, 1000)`, but performance didn't improve much. I noticed that GORM generates one massive SQL statement for all the records, which might be causing the slowdown.

My database connection is configured like this:

```go
dsn := "host=localhost user=myuser password=mypassword dbname=mydb port=5432 sslmode=disable"
db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
if err != nil {
	log.Fatal(err)
}
```

Is there a better way to handle bulk inserts with GORM for high performance? Are there best practices or configurations I could adjust to speed up this operation? For context, this runs inside a CLI tool. Any insights would be greatly appreciated!
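For reference, my mental model of what `CreateInBatches(&users, 1000)` should be doing is splitting the slice into chunks and issuing one multi-row `INSERT` per chunk. Here's a minimal, DB-free sketch of that chunking (the `chunk` helper is my own illustration, not a GORM API), in case I'm misunderstanding how the batching is supposed to work:

```go
package main

import "fmt"

type User struct {
	ID    uint
	Name  string
	Email string
}

// chunk splits users into batches of at most size records -- what I
// assume CreateInBatches does before handing each batch to one INSERT.
func chunk(users []User, size int) [][]User {
	var batches [][]User
	for start := 0; start < len(users); start += size {
		end := start + size
		if end > len(users) {
			end = len(users)
		}
		batches = append(batches, users[start:end])
	}
	return batches
}

func main() {
	var users []User
	for i := 0; i < 10000; i++ {
		users = append(users, User{
			Name:  fmt.Sprintf("User%d", i),
			Email: fmt.Sprintf("user%d@example.com", i),
		})
	}
	batches := chunk(users, 1000)
	fmt.Println(len(batches)) // 10 batches of 1000 records each
}
```

If that picture is right, I'd expect 10 statements of 1,000 rows each to be much faster than one 10,000-row statement, which is why the lack of improvement surprises me.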