
How can I read a whole file into a string variable

I have lots of small files, I don't want to read them line by line.

Is there a function in Go that will read a whole file into a string variable?


Tim Cooper

Use ioutil.ReadFile (os.ReadFile since Go 1.16):

func ReadFile(filename string) ([]byte, error)

ReadFile reads the file named by filename and returns the contents. A successful call returns err == nil, not err == EOF. Because ReadFile reads the whole file, it does not treat an EOF from Read as an error to be reported.

You will get a []byte instead of a string. It can be converted if really necessary:

s := string(buf)

Then, to build the final string from many files, you can use append() to accumulate the data in a single byte slice as you read each file and convert that slice to a string at the end. Alternatively, you might like bytes.Join; a sketch of that approach follows.
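For illustration, a minimal sketch of the bytes.Join approach; the filenames slice and the file names in it are hypothetical, and unreadable files are simply skipped:

package main

import (
    "bytes"
    "fmt"
    "io/ioutil"
)

func main() {
    // Hypothetical list of small files to concatenate.
    filenames := []string{"a.txt", "b.txt", "c.txt"}

    var parts [][]byte
    for _, name := range filenames {
        buf, err := ioutil.ReadFile(name) // os.ReadFile since Go 1.16
        if err != nil {
            fmt.Println(err) // skip files that can't be read
            continue
        }
        parts = append(parts, buf)
    }

    // Join everything into one string; nil means no separator.
    s := string(bytes.Join(parts, nil))
    fmt.Println(s)
}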
Show us how to convert it then... The question doesn't ask for a byte array.
Using this to open an HTML file, I find a newline is appended after every line, which is messing up some of my formatting. Is there any way to avoid that?
Go is strongly typed, so the conversion really is necessary if you need a string rather than a []byte. Happily, string(bytes) gets the job done.
ioutil is deprecated now; use os for file handling.
openwonk

If you just want the content as a string, the simple solution is to use the ReadFile function: os.ReadFile in Go 1.16 and later, or ioutil.ReadFile from the io/ioutil package in earlier versions. It returns a slice of bytes, which you can easily convert to a string.

Go 1.16 or later

As of Go 1.16, ReadFile lives in the os package, so this example uses os instead of ioutil.

package main

import (
    "fmt"
    "os"
)

func main() {
    b, err := os.ReadFile("file.txt") // just pass the file name
    if err != nil {
        fmt.Print(err)
    }

    fmt.Println(b) // print the content as 'bytes'

    str := string(b) // convert content to a 'string'

    fmt.Println(str) // print the content as a 'string'
}

Go 1.15 or earlier

package main

import (
    "fmt"
    "io/ioutil"
)

func main() {
    b, err := ioutil.ReadFile("file.txt") // just pass the file name
    if err != nil {
        fmt.Print(err)
    }

    fmt.Println(b) // print the content as 'bytes'

    str := string(b) // convert content to a 'string'

    fmt.Println(str) // print the content as a 'string'
}

ioutil is deprecated now.
Thanks @PirateKing. Updated answer using os package.
Thanks @openwonk, can you move the Go 1.16 example to the top of the answer to make sure it's noticed?
Running Wild

I think the best thing to do, if you're really concerned about the efficiency of concatenating all of these files, is to copy them all into the same bytes buffer.

buf := bytes.NewBuffer(nil)
for _, filename := range filenames {
  f, _ := os.Open(filename) // Error handling elided for brevity.
  io.Copy(buf, f)           // Error handling elided for brevity.
  f.Close()
}
s := string(buf.Bytes())

This opens each file, copies its contents into buf, then closes the file. Depending on your situation you may not actually need to convert it, the last line is just to show that buf.Bytes() has the data you're looking for.
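
For reference, a sketch of the same loop with the elided error handling written out; the file names here are placeholders:

package main

import (
    "bytes"
    "fmt"
    "io"
    "log"
    "os"
)

func main() {
    // Hypothetical list of files to concatenate.
    filenames := []string{"a.txt", "b.txt"}

    buf := bytes.NewBuffer(nil)
    for _, filename := range filenames {
        f, err := os.Open(filename)
        if err != nil {
            log.Fatal(err)
        }
        if _, err := io.Copy(buf, f); err != nil {
            f.Close()
            log.Fatal(err)
        }
        f.Close()
    }

    s := buf.String() // equivalent to string(buf.Bytes())
    fmt.Print(s)
}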


Hi, will io.Copy overwrite buf's content? And what's the capacity of buf? Thanks.
Copy won't overwrite; it will just keep appending to buf, and buf will grow as much as it needs to accommodate the new data.
buf has effectively unlimited capacity; it will continue to expand as more data is added. ioutil.ReadFile, by contrast, allocates a buffer that is big enough to fit the complete file, so it doesn't need to reallocate.
Does using a bytes.Buffer really improve performance compared to simply appending to a byte slice? What about memory? How big is the difference?
Raghav Dinesh

This is how I did it:

package main

import (
    "bytes"
    "fmt"
    "log"
    "os"
)

func main() {
    filerc, err := os.Open("filename")
    if err != nil {
        log.Fatal(err)
    }
    defer filerc.Close()

    // Drain the open file into a buffer, then take its contents as a string.
    buf := new(bytes.Buffer)
    if _, err := buf.ReadFrom(filerc); err != nil {
        log.Fatal(err)
    }
    contents := buf.String()

    fmt.Print(contents)
}

While ioutil.ReadFile is more concise and thus is preferable when the file isn't already open, buf.ReadFrom() works in cases where you are already given an open file, so it is a good second answer.
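As a sketch of that point, a small helper (readAll is a made-up name, not a standard function) works with any io.Reader you've already opened, not just an *os.File:

package main

import (
    "bytes"
    "fmt"
    "io"
    "log"
    "os"
)

// readAll drains an already-open reader into a string via bytes.Buffer.ReadFrom.
func readAll(r io.Reader) (string, error) {
    buf := new(bytes.Buffer)
    if _, err := buf.ReadFrom(r); err != nil {
        return "", err
    }
    return buf.String(), nil
}

func main() {
    f, err := os.Open("filename")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()

    contents, err := readAll(f)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Print(contents)
}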
Nimantha

You can use strings.Builder:

package main

import (
   "fmt"
   "io"
   "os"
   "strings"
)

func main() {
   f, err := os.Open("file.txt")
   if err != nil {
      panic(err)
   }
   defer f.Close()
   b := new(strings.Builder)
   if _, err := io.Copy(b, f); err != nil {
      panic(err)
   }
   fmt.Print(b.String())
}

Or if you don't mind []byte, you can use os.ReadFile:

package main
import "os"

func main() {
   b, err := os.ReadFile("file.txt")
   if err != nil {
      panic(err)
   }
   os.Stdout.Write(b)
}

As @Cerise Limón's answer suggests, "You can improve the strings.Builder example by growing the builder to the size of the file before reading"; we could call fi, _ := f.Stat() followed by b.Grow(int(fi.Size())) before io.Copy(b, f), as in the sketch below.
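
A sketch of that suggestion applied to the strings.Builder example above (same file.txt; if Stat fails, the Grow step is simply skipped):

package main

import (
    "fmt"
    "io"
    "os"
    "strings"
)

func main() {
    f, err := os.Open("file.txt")
    if err != nil {
        panic(err)
    }
    defer f.Close()

    b := new(strings.Builder)
    // Pre-size the builder to the file size so io.Copy doesn't have to
    // grow it repeatedly while reading.
    if fi, err := f.Stat(); err == nil {
        b.Grow(int(fi.Size()))
    }
    if _, err := io.Copy(b, f); err != nil {
        panic(err)
    }
    fmt.Print(b.String())
}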
fwhez

I'm not at a computer, so this is just a draft, but it should make the idea clear.

package main

import (
    "fmt"
    "io/ioutil"
    "path/filepath"
    "sync"
)

func main() {
    const dir = "/etc/"

    // Collect the names of the regular files in dir.
    filesInfo, err := ioutil.ReadDir(dir)
    if err != nil {
        fmt.Println(err)
        return
    }
    var fileNames []string
    for _, v := range filesInfo {
        if !v.IsDir() {
            fileNames = append(fileNames, v.Name())
        }
    }

    // Read every file concurrently, one goroutine per file.
    fileNumber := len(fileNames)
    contents := make([]string, fileNumber)
    wg := sync.WaitGroup{}
    wg.Add(fileNumber)
    for i := range fileNames {
        go func(i int) {
            defer wg.Done()
            buf, err := ioutil.ReadFile(filepath.Join(dir, fileNames[i]))
            if err != nil {
                return
            }
            contents[i] = string(buf) // each entry holds one whole file as a string
        }(i)
    }
    wg.Wait()
}