Source code analysis: how Prometheus scrapes targets over HTTP/HTTPS

Prometheus pulls a target's metrics over either HTTP or HTTPS.
Alertmanager, for example, is scraped over HTTP:

http://10.233.96.91:9090/metrics

The corresponding scrape config:

scrape_configs:
- job_name: monitoring/alertmanager/0
  honor_timestamps: true
  scrape_interval: 30s
  scrape_timeout: 10s
  metrics_path: /metrics
  scheme: http
  ......

kube-state-metrics is scraped over HTTPS:

https://10.233.96.90:8443/metrics

The corresponding scrape config:

- job_name: monitoring/kube-state-metrics/0
  honor_labels: true
  honor_timestamps: true
  scrape_interval: 30s
  scrape_timeout: 30s
  metrics_path: /metrics
  scheme: https
  bearer_token_file: /var/run/secrets/kubernetes.io/serviceaccount/token
  tls_config:
    insecure_skip_verify: true

As you can see, compared with the HTTP target, the HTTPS target's configuration:
  • sets scheme: https;
  • adds a bearer_token_file entry;
  • adds a tls_config entry.
Initializing the HTTP client

When the scrape pool is created, the client is built from cfg.HTTPClientConfig:
//scrape/scrape.go
func newScrapePool(cfg *config.ScrapeConfig, app storage.Appendable, jitterSeed uint64, logger log.Logger) (*scrapePool, error) {
    ......
    client, err := config_util.NewClientFromConfig(cfg.HTTPClientConfig, cfg.JobName, false)
    ....
    sp := &scrapePool{
        cancel:        cancel,
        appendable:    app,
        config:        cfg,
        client:        client,
        activeTargets: map[uint64]*Target{},
        loops:         map[uint64]loop{},
        logger:        logger,
    }
    .....
}
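For reference, here is a minimal sketch of building the same kind of client outside of Prometheus, assuming the github.com/prometheus/common/config package at the version quoted above (where NewClientFromConfig still takes a disableKeepAlives bool); the target URL mirrors the kube-state-metrics example and is only a placeholder:

package main

import (
    "fmt"

    config_util "github.com/prometheus/common/config"
)

func main() {
    // HTTP client settings mirroring the kube-state-metrics scrape config above.
    cfg := config_util.HTTPClientConfig{
        BearerTokenFile: "/var/run/secrets/kubernetes.io/serviceaccount/token",
        TLSConfig:       config_util.TLSConfig{InsecureSkipVerify: true},
    }

    // Same call newScrapePool makes; the name only labels conntrack metrics.
    client, err := config_util.NewClientFromConfig(cfg, "example", false)
    if err != nil {
        panic(err)
    }

    // Placeholder target; inside the cluster this would be the pod endpoint.
    resp, err := client.Get("https://10.233.96.90:8443/metrics")
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    fmt.Println(resp.Status)
}

The returned *http.Client already carries the bearer-token and TLS round trippers described below.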

The HTTPClientConfig struct carries the bearer token and TLS configuration:
type HTTPClientConfig struct {
    // The HTTP basic authentication credentials for the targets.
    BasicAuth *BasicAuth `yaml:"basic_auth,omitempty"`
    // The bearer token for the targets.
    BearerToken Secret `yaml:"bearer_token,omitempty"`
    // The bearer token file for the targets.
    BearerTokenFile string `yaml:"bearer_token_file,omitempty"`
    // HTTP proxy server to use to connect to the targets.
    ProxyURL URL `yaml:"proxy_url,omitempty"`
    // TLSConfig to use to connect to the targets.
    TLSConfig TLSConfig `yaml:"tls_config,omitempty"`
}
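To make the yaml tags concrete, here is a sketch of unmarshalling just the HTTP-client portion of the kube-state-metrics scrape config into this struct, assuming gopkg.in/yaml.v2 (the yaml library this version of the config package uses):

package main

import (
    "fmt"

    config_util "github.com/prometheus/common/config"
    "gopkg.in/yaml.v2"
)

func main() {
    // Only the HTTP-client related keys from the scrape config shown earlier.
    raw := []byte(`
bearer_token_file: /var/run/secrets/kubernetes.io/serviceaccount/token
tls_config:
  insecure_skip_verify: true
`)
    var cfg config_util.HTTPClientConfig
    if err := yaml.Unmarshal(raw, &cfg); err != nil {
        panic(err)
    }
    fmt.Println(cfg.BearerTokenFile)              // the service account token path
    fmt.Println(cfg.TLSConfig.InsecureSkipVerify) // true
}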

For an HTTPS client, the TLS config and the bearer token are wired into the client:
// NewRoundTripperFromConfig returns a new HTTP RoundTripper configured for the
// given config.HTTPClientConfig. The name is used as go-conntrack metric label.
func NewRoundTripperFromConfig(cfg HTTPClientConfig, name string, disableKeepAlives bool) (http.RoundTripper, error) {
    newRT := func(tlsConfig *tls.Config) (http.RoundTripper, error) {
        // The only timeout we care about is the configured scrape timeout.
        // It is applied on request. So we leave out any timings here.
        var rt http.RoundTripper = &http.Transport{
            Proxy:               http.ProxyURL(cfg.ProxyURL.URL),
            MaxIdleConns:        20000,
            MaxIdleConnsPerHost: 1000, // see https://github.com/golang/go/issues/13801
            DisableKeepAlives:   disableKeepAlives,
            TLSClientConfig:     tlsConfig,
            DisableCompression:  true,
            ......
        }
        // TODO: use ForceAttemptHTTP2 when we move to Go 1.13+.
        err := http2.ConfigureTransport(rt.(*http.Transport))

        // If a bearer token is provided, create a round tripper that will set the
        // Authorization header correctly on each request.
        if len(cfg.BearerToken) > 0 {
            rt = NewBearerAuthRoundTripper(cfg.BearerToken, rt)
        } else if len(cfg.BearerTokenFile) > 0 {
            rt = NewBearerAuthFileRoundTripper(cfg.BearerTokenFile, rt)
        }
        ...
        // Return a new configured RoundTripper.
        return rt, nil
    }

    tlsConfig, err := NewTLSConfig(&cfg.TLSConfig)
    if err != nil {
        return nil, err
    }
    ...
    return newTLSRoundTripper(tlsConfig, cfg.TLSConfig.CAFile, newRT)
}
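The bearer-token and TLS round trippers are plain decorators layered on top of the base http.Transport. Below is a self-contained sketch of that wrapping pattern using only the standard library; the header name and target URL are illustrative, not Prometheus code:

package main

import (
    "fmt"
    "net/http"
)

// headerRoundTripper sets a fixed header on every outgoing request and then
// delegates to the wrapped RoundTripper, the same layering that
// NewBearerAuthRoundTripper applies around the http.Transport above.
type headerRoundTripper struct {
    key, value string
    next       http.RoundTripper
}

func (h *headerRoundTripper) RoundTrip(req *http.Request) (*http.Response, error) {
    if req.Header.Get(h.key) == "" {
        // A RoundTripper must not mutate the caller's request, so work on a
        // clone (the Prometheus code uses its own cloneRequest helper for this).
        req = req.Clone(req.Context())
        req.Header.Set(h.key, h.value)
    }
    return h.next.RoundTrip(req)
}

func main() {
    client := &http.Client{
        Transport: &headerRoundTripper{
            key:   "X-Example", // illustrative header, not one Prometheus sets
            value: "demo",
            next:  http.DefaultTransport,
        },
    }
    resp, err := client.Get("http://10.233.96.91:9090/metrics")
    if err != nil {
        fmt.Println("request failed:", err)
        return
    }
    defer resp.Body.Close()
    fmt.Println(resp.Status)
}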

The bearer token is added to the HTTP Authorization header (Authorization: Bearer $token):
// NewBearerAuthRoundTripper adds the provided bearer token to a request unless the authorization
// header has already been set.
func NewBearerAuthRoundTripper(token Secret, rt http.RoundTripper) http.RoundTripper {
    return &bearerAuthRoundTripper{token, rt}
}

func (rt *bearerAuthRoundTripper) RoundTrip(req *http.Request) (*http.Response, error) {
    if len(req.Header.Get("Authorization")) == 0 {
        req = cloneRequest(req)
        req.Header.Set("Authorization", fmt.Sprintf("Bearer %s", string(rt.bearerToken)))
    }
    return rt.rt.RoundTrip(req)
}
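RoundTrip works on a clone because a RoundTripper is not allowed to mutate the caller's request. The cloneRequest helper is not shown in the snippet above; here is a sketch of what such a helper typically looks like (shallow copy of the struct, deep copy of the header map), as an illustration rather than a quote of the Prometheus source:

package config // illustrative placement; the real helper lives in prometheus/common

import "net/http"

// cloneRequest returns a copy of r that is safe to modify: the struct is
// shallow-copied and the Header map is deep-copied, so setting the
// Authorization header on the copy does not leak into the caller's request.
func cloneRequest(r *http.Request) *http.Request {
    r2 := new(http.Request)
    *r2 = *r
    r2.Header = make(http.Header, len(r.Header))
    for k, v := range r.Header {
        r2.Header[k] = append([]string(nil), v...)
    }
    return r2
}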

You can reproduce this with curl:
curl -k https://172.160.230.217:8443/metrics -H "Authorization: Bearer eyJhbGciOiJSUzI1NiI...g"
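The same request expressed in Go for comparison with the curl call, using only the standard library; the address and token value are placeholders:

package main

import (
    "crypto/tls"
    "fmt"
    "io/ioutil"
    "net/http"
)

func main() {
    // Equivalent of curl -k: skip verification of the server certificate,
    // which is what insecure_skip_verify: true does in the scrape config.
    client := &http.Client{
        Transport: &http.Transport{
            TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
        },
    }

    req, err := http.NewRequest("GET", "https://172.160.230.217:8443/metrics", nil)
    if err != nil {
        panic(err)
    }
    // Placeholder token; in-cluster it would be read from the service account token file.
    req.Header.Set("Authorization", "Bearer <token>")

    resp, err := client.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    body, _ := ioutil.ReadAll(resp.Body)
    fmt.Printf("%s, %d bytes of metrics\n", resp.Status, len(body))
}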

Scraping with the HTTP client

Once the HTTP client is built, the scrape itself is straightforward: an HTTP GET on /metrics.
// scrape/scrape.go
func (s *targetScraper) scrape(ctx context.Context, w io.Writer) (string, error) {
    if s.req == nil {
        req, err := http.NewRequest("GET", s.URL().String(), nil)
        if err != nil {
            return "", err
        }
        req.Header.Add("Accept", acceptHeader)
        req.Header.Add("Accept-Encoding", "gzip")
        req.Header.Set("User-Agent", userAgentHeader)
        req.Header.Set("X-Prometheus-Scrape-Timeout-Seconds", fmt.Sprintf("%f", s.timeout.Seconds()))

        s.req = req
    }

    resp, err := s.client.Do(s.req.WithContext(ctx))
    ......
    if resp.Header.Get("Content-Encoding") != "gzip" {
        _, err = io.Copy(w, resp.Body)
        if err != nil {
            return "", err
        }
        return resp.Header.Get("Content-Type"), nil
    }
    .....
    return resp.Header.Get("Content-Type"), nil
}
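Putting it together, here is a minimal standalone scraper that mirrors this flow: build the request with the same kind of headers, send it, and gunzip the body if the server compressed it. The header values and target URL are placeholders, not the acceptHeader/userAgentHeader constants from scrape.go:

package main

import (
    "compress/gzip"
    "fmt"
    "io"
    "net/http"
    "os"
)

// scrapeOnce fetches url and writes the (decompressed) metrics body to w,
// returning the Content-Type, roughly what targetScraper.scrape does.
func scrapeOnce(url string, w io.Writer) (string, error) {
    req, err := http.NewRequest("GET", url, nil)
    if err != nil {
        return "", err
    }
    // Placeholder values for the headers targetScraper sets.
    req.Header.Add("Accept", "text/plain;version=0.0.4")
    req.Header.Add("Accept-Encoding", "gzip")
    req.Header.Set("User-Agent", "example-scraper/0.1")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return "", err
    }
    defer resp.Body.Close()

    var body io.Reader = resp.Body
    if resp.Header.Get("Content-Encoding") == "gzip" {
        // The real code reuses a gzip.Reader across scrapes; a fresh one is enough here.
        gz, err := gzip.NewReader(resp.Body)
        if err != nil {
            return "", err
        }
        defer gz.Close()
        body = gz
    }
    if _, err := io.Copy(w, body); err != nil {
        return "", err
    }
    return resp.Header.Get("Content-Type"), nil
}

func main() {
    ct, err := scrapeOnce("http://10.233.96.91:9090/metrics", os.Stdout)
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    fmt.Fprintln(os.Stderr, "Content-Type:", ct)
}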
