Tip: This series is for readers with a persistent urge to Go
I. Making HTTP requests in Go
1. The Go standard library provides the net/http package for handling network connections. http.Get issues an HTTP request and returns the server's response as a stream; ioutil.ReadAll reads the entire response body.
```go
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"os"
)

func main() {
	for _, arg := range os.Args[1:] {
		res, err := http.Get(arg)
		if err != nil {
			fmt.Println(err)
			os.Exit(1)
		}
		b, err := ioutil.ReadAll(res.Body)
		res.Body.Close()
		if err != nil {
			fmt.Println(err)
			os.Exit(1)
		}
		fmt.Printf("%s", b)
	}
}
```
Take visiting 360 as an example. A request timeout is surfaced as an ordinary error, not a response.
II. Practice
1. io.Copy(dst, src) reads from src and writes what it reads to dst. Using it instead of ioutil.ReadAll copies the response body straight to an io.Writer, avoiding an intermediate buffer (b in the previous example). Remember to handle the error returned by io.Copy.
```go
package main

import (
	"bufio"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	for _, arg := range os.Args[1:] {
		res, err := http.Get(arg)
		if err != nil {
			fmt.Println(err)
			os.Exit(1)
		}
		out, err := os.Create("/tmp/buf_file2.txt")
		if err != nil {
			fmt.Println(err)
			os.Exit(1)
		}
		// Wrap the file in a buffered io.Writer
		wt := bufio.NewWriter(out)
		result, err := io.Copy(wt, res.Body)
		res.Body.Close()
		if err != nil {
			fmt.Println(err)
			os.Exit(1)
		}
		fmt.Println(result)
		wt.Flush()
		out.Close()
	}
}
```
2. If a URL argument does not begin with http://, add the prefix to it. strings.HasPrefix helps here.
strings.HasPrefix(s, prefix) tests the beginning of a string: it returns true if s starts with prefix, and false otherwise.
```go
func HasPrefix(s, prefix string) bool {
	return len(s) >= len(prefix) && s[0:len(prefix)] == prefix // compare by slicing
}
```
We just need to check each argument:
```go
package main

import (
	"bufio"
	"fmt"
	"io"
	"net/http"
	"os"
	"strings"
)

func main() {
	for _, arg := range os.Args[1:] {
		if strings.HasPrefix(arg, "http://") {
			fmt.Println(arg)
			get_txt(arg)
		} else {
			arg = "http://" + arg
			get_txt(arg)
			fmt.Println(arg)
		}
	}
}

func get_txt(arg string) {
	res, err := http.Get(arg)
	if err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	out, err := os.Create("/tmp/buf_file2.txt")
	if err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	// Wrap the file in a buffered io.Writer
	wt := bufio.NewWriter(out)
	result, err := io.Copy(wt, res.Body)
	res.Body.Close()
	if err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	fmt.Println(result)
	wt.Flush()
	out.Close()
}
```
Displaying the HTTP status code, which can be read from the res.Status field.
Add a get_code function like this, and replace get_txt with get_code in main:
```go
func get_code(arg string) {
	res, err := http.Get(arg)
	if err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	defer res.Body.Close()
	status := res.Status
	fmt.Println("Http Code :", status)
}
```
3. A First Taste of Go Concurrency (Extension)
The ease of building highly concurrent applications is one of Go's hallmarks. When requesting multiple URLs, we can create concurrency with goroutines (which I think of as coroutines) and pass data between them through channels. Let's build an application that requests all the URLs at once.
Create a new_routime(arg string, ch chan<- string) goroutine for each command-line argument, passing in the URL and the channel. The function records a start and end time to measure how long each request takes.
```go
package main

import (
	"bufio"
	"fmt"
	"io"
	"io/ioutil"
	"net/http"
	"os"
	"strings"
	"time"
)

func get_txt(arg string) {
	// ...
}

func get_code(arg string) {
	// ...
}

func main() {
	start := time.Now()     // record the start time
	ch := make(chan string) // create a string channel
	for _, arg := range os.Args[1:] {
		// Start one goroutine per argument
		if strings.HasPrefix(arg, "http://") {
			go new_routime(arg, ch)
		} else {
			arg = "http://" + arg
			go new_routime(arg, ch)
		}
	}
	// Read the channel data
	for range os.Args[1:] {
		fmt.Println(<-ch)
	}
	end := time.Since(start).Seconds() // elapsed time
	fmt.Printf("time used :%.2fs\n", end)
}

func new_routime(arg string, ch chan<- string) {
	start_time := time.Now()
	res, err := http.Get(arg)
	if err != nil {
		ch <- fmt.Sprintf("err: %v", err)
		return
	}
	size_bytes, err := io.Copy(ioutil.Discard, res.Body)
	res.Body.Close()
	if err != nil {
		ch <- fmt.Sprintf("reading url: %v", err)
		return
	}
	end_time := time.Since(start_time).Seconds()
	ch <- fmt.Sprintf("%.2fs %10d %s", end_time, size_bytes, arg) // elapsed time, response size, URL
}
```
fmt.Sprintf formats its arguments into a string; that string is then sent into the channel.
ioutil.Discard is an io.Writer that discards everything written to it, a convenient sink for data we don't care about (io.Copy still reports how many bytes passed through).
Book reference: The Go Programming Language (the "Go Bible").
Feel free to point out any shortcomings in the comments; favorites, likes, and questions are all welcome.