Suri · 3y
Does the STL queue increase its size by itself? Like, when I push_back into a vector, it gets automatically resized.
Is it the same for queue? I heard that it has a size_t, though.

Answers (1)
Kumar AU · Nov 29

Hello Jayraj, thank you so much for your response and the time you have spent on this.

Actually, as you can see in my original question, Table 1 and Table 2 will have a foreign-key relation (the FileId column) which needs to be addressed, and the file contents can be more than 100K records. In this situation, I am thinking of using a DataTable and a stored procedure in PostgreSQL for better performance while reading the entire file.

As posted in my original question, could you please let me know how I can achieve inserting data into the Postgres tables using a DataTable in Go, and also a DataTable in PostgreSQL, either with a direct inline query in Go or using a stored procedure?
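For the 100K-record volume mentioned above, one common way to cut round trips (a sketch of a general technique, not something confirmed in this thread; the table and column names table2, FileId, Number1, Account, Code are assumptions taken from the discussion) is to batch many rows into a single multi-row INSERT:

```go
package main

import (
	"fmt"
	"strings"
)

// buildBatchInsert builds one multi-row INSERT statement and its flattened
// argument list, so many rows reach PostgreSQL in a single Exec call.
// Table and column names here are assumptions based on the thread.
func buildBatchInsert(rows [][]interface{}) (string, []interface{}) {
	var sb strings.Builder
	sb.WriteString("INSERT INTO table2 (FileId, Number1, Account, Code) VALUES ")
	args := make([]interface{}, 0, len(rows)*4)
	for i, row := range rows {
		if i > 0 {
			sb.WriteString(", ")
		}
		base := i * 4
		// Placeholders are numbered $1, $2, ... across all rows.
		sb.WriteString(fmt.Sprintf("($%d, $%d, $%d, $%d)", base+1, base+2, base+3, base+4))
		args = append(args, row...)
	}
	return sb.String(), args
}

func main() {
	rows := [][]interface{}{
		{1, "12345", "9876", "ABC"},
		{1, "54321", "6789", "DEF"},
	}
	query, args := buildBatchInsert(rows)
	fmt.Println(query)
	fmt.Println(len(args)) // 8 flattened arguments for 2 rows
}
```

The query and args pair can then be passed to db.Exec; for very large files you would chunk the rows (PostgreSQL limits a statement to 65535 bind parameters) or switch to COPY-based bulk loading.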

Jayraj Chhaya · Nov 29

Hi Kumar AU,

To parse the provided text file and populate the data into PostgreSQL using Go, you can follow these steps:

  1. Read the File: Use the os and bufio packages to read the file line by line.
  2. Parse the Data: Identify the row types based on the starting character and extract the necessary fields.
  3. Store in Data Structures: Use slices or structs to hold the parsed data temporarily.
  4. Insert into PostgreSQL: Utilize the database/sql package along with a PostgreSQL driver (like lib/pq) to insert the data into the respective tables.
package main

import (
    "bufio"
    "database/sql"
    "fmt"
    "os"
    _ "github.com/lib/pq" // blank import registers the "postgres" driver
)

type FileRecord struct {
    FileHeader     string
    BatchHeader    string
    BatchTrailer   string
    FileTrailer    string
}

type DetailRecord struct {
    FileId   int
    Number1  string
    Account  string
    Code     string
}

func main() {
    // Open the file
    file, err := os.Open("data.txt")
    if err != nil {
        fmt.Println(err)
        return
    }
    defer file.Close()

    var fileRecords []FileRecord
    var detailRecords []DetailRecord
    var fileId int

    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        line := scanner.Text()
        if len(line) == 0 {
            continue // skip blank lines so line[0] cannot panic
        }
        switch line[0] {
        case '1':
            // Parse file header
            fileRecords = append(fileRecords, FileRecord{FileHeader: line})
        case '5':
            // Parse batch header
            fileRecords[fileId].BatchHeader = line
        case '6':
            // Parse detail record (slice positions assume a fixed-width layout)
            if len(line) < 15 {
                continue
            }
            detailRecords = append(detailRecords, DetailRecord{
                FileId:  fileId,
                Number1: line[5:10],
                Account: line[11:15],
            })
        case '7':
            // Parse additional record for the most recent detail row
            if len(line) < 7 || len(detailRecords) == 0 {
                continue
            }
            detailRecords[len(detailRecords)-1].Code = line[4:7]
        case '8':
            // Parse batch trailer
            fileRecords[fileId].BatchTrailer = line
            fileId++
        case '9':
            // Parse file trailer
            fileRecords[fileId-1].FileTrailer = line
        }
    }
    if err := scanner.Err(); err != nil {
        fmt.Println(err)
        return
    }

    // Insert into PostgreSQL
    db, err := sql.Open("postgres", "user=username dbname=mydb sslmode=disable")
    if err != nil {
        fmt.Println(err)
        return
    }
    defer db.Close()

    for _, record := range fileRecords {
        _, err := db.Exec("INSERT INTO table1 (FileHeader, BatchHeader, BatchTrailer, FileTrailer) VALUES ($1, $2, $3, $4)", 
            record.FileHeader, record.BatchHeader, record.BatchTrailer, record.FileTrailer)
        if err != nil {
            fmt.Println(err)
        }
    }

    for _, detail := range detailRecords {
        _, err := db.Exec("INSERT INTO table2 (FileId, Number1, Account, Code) VALUES ($1, $2, $3, $4)", 
            detail.FileId, detail.Number1, detail.Account, detail.Code)
        if err != nil {
            fmt.Println(err)
        }
    }
}
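The fixed-width slicing in the loop above (line[5:10], line[11:15]) will panic on lines shorter than the expected width. A guarded helper for the '6' detail rows might look like this (a sketch only: the slice positions mirror the answer above and are assumptions about the file layout, not a confirmed spec):

```go
package main

import (
	"errors"
	"fmt"
)

// Detail holds the fields pulled from a '6' detail row. The slice
// positions (5:10 and 11:15) are taken from the answer above and are
// assumptions about the fixed-width layout.
type Detail struct {
	Number1 string
	Account string
}

// parseDetail extracts the detail fields from one line, first checking
// the line is long enough to slice safely and starts with '6'.
func parseDetail(line string) (Detail, error) {
	if len(line) < 15 || line[0] != '6' {
		return Detail{}, errors.New("not a valid detail line")
	}
	return Detail{Number1: line[5:10], Account: line[11:15]}, nil
}

func main() {
	d, err := parseDetail("6xxxx12345x9876")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(d.Number1, d.Account) // prints "12345 9876"
}
```

Returning an error instead of panicking lets the caller decide whether a malformed line should abort the whole import or just be logged and skipped.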

This code provides a basic structure for reading the file, parsing the data, and inserting it into PostgreSQL. Adjust the database connection string, table names, and the fixed-width slice positions to match your actual schema and file layout.