
I am trying to install and set up Elasticsearch 8 on a CentOS 7 server. The service starts, but when I run curl -X GET http://localhost:9200, curl -X GET http://127.0.0.1:9200, or curl -X GET http://(local IP of the host):9200, the response is:

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html><head>
<meta type="copyright" content="Copyright (C) 1996-2016 The Squid Software Foundation and contributors">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>ERROR: The requested URL could not be retrieved</title>
<style type="text/css"><!--
 /*
 * Copyright (C) 1996-2016 The Squid Software Foundation and contributors
 *
 * Squid software is distributed under GPLv2+ license and includes
 * contributions from numerous individuals and organizations.
 * Please see the COPYING and CONTRIBUTORS files for details.
 */

/* Stylesheet for Squid Error pages Adapted from design by Free CSS Templates http://www.freecsstemplates.org Released for free under a Creative Commons Attribution 2.5 License */

/* Page basics */

  • { font-family: verdana, sans-serif;

}

html body { margin: 0; padding: 0; background: #efefef; font-size: 12px; color: #1e1e1e; }

/* Page displayed title area */ #titles { margin-left: 15px; padding: 10px; padding-left: 100px; background: url('/squid-internal-static/icons/SN.png') no-repeat left; }

/* initial title */ #titles h1 { color: #000000; } #titles h2 { color: #000000; }

/* special event: FTP success page titles */ #titles ftpsuccess { background-color:#00ff00; width:100%; }

/* Page displayed body content area */ #content { padding: 10px; background: #ffffff; }

/* General text */ p { }

/* error brief description */ #error p { }

/* some data which may have caused the problem */ #data { }

/* the error message received from the system or other software */ #sysmsg { }

pre { font-family:sans-serif; }

/* special event: FTP / Gopher directory listing */ #dirmsg { font-family: courier; color: black; font-size: 10pt; } #dirlisting { margin-left: 2%; margin-right: 2%; } #dirlisting tr.entry td.icon,td.filename,td.size,td.date { border-bottom: groove; } #dirlisting td.size { width: 50px; text-align: right; padding-right: 5px; }

/* horizontal lines */ hr { margin: 0; }

/* page displayed footer area */ #footer { font-size: 9px; padding-left: 10px; }

body :lang(fa) { direction: rtl; font-size: 100%; font-family: Tahoma, Roya, sans-serif; float: right; } :lang(he) { direction: rtl; } --></style> </head><body id=ERR_ACCESS_DENIED> <div id="titles"> <h1>ERROR</h1> <h2>The requested URL could not be retrieved</h2> </div> <hr>

<div id="content"> <p>The following error was encountered while trying to retrieve the URL: <a href="http://127.0.0.1:9200/">http://127.0.0.1:9200/</a></p>

<blockquote id="error"> <p><b>Access Denied.</b></p> </blockquote>

<p>Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect.</p>

<p>Your cache administrator is <a href="mailto:?subject=CacheErrorInfo%20-%20ERR_ACCESS_DENIED&amp;body=CacheHost%3A%20frontend1%0D%0AErrPage%3A%20ERR_ACCESS_DENIED%0D%0AErr%3A%20%5Bnone%5D%0D%0ATimeStamp%3A%20Fri,%2001%20Jul%202022%2013%3A14%3A40%20GMT%0D%0A%0D%0AClientIP%3A%20176.119.254.121%0D%0A%0D%0AHTTP%20Request%3A%0D%0AGET%20%2F%20HTTP%2F1.1%0AUser-Agent%3A%20curl%2F7.29.0%0D%0AAccept%3A%20%2F%0D%0AProxy-Connection%3A%20Keep-Alive%0D%0AHost%3A%20127.0.0.1%3A9200%0D%0A%0D%0A%0D%0A">netadmin@birzeit.edu</a>.</p> <br> </div>

<hr> <div id="footer"> <p>Generated Fri, 01 Jul 2022 13:14:40 GMT by frontend1 (squid/3.5.20)</p> <!-- ERR_ACCESS_DENIED --> </div> </body></html>

In elasticsearch.yml I uncommented the following lines and changed their values as below:

network.host: (your IP address)
http.port: 9200
node.name: node-1 (or a preferred name)
cluster.initial_master_nodes: node-1 (the same name as node.name)
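For example, with placeholder values filled in, the uncommented block in my elasticsearch.yml looks like this (192.168.1.10 and node-1 stand in for the real address and node name):

network.host: 192.168.1.10                 # this host's local IP address
http.port: 9200                            # default Elasticsearch HTTP port
node.name: node-1                          # any preferred node name
cluster.initial_master_nodes: ["node-1"]   # must list node.name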

The firewall service is disabled.
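To confirm, assuming firewalld is the firewall service on this CentOS 7 host:

systemctl is-active firewalld    # prints "inactive" when the service is stopped
systemctl is-enabled firewalld   # prints "disabled" when it will not start at boot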

The Squid configuration file:

#
# Recommended minimum configuration:
#

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7       # RFC 4193 local private network range
acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT

#
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost

# And finally deny all other access to this proxy
http_access deny all

# Squid normally listens to port 3128
http_port 3128

# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256

# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid

#
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

How can I solve this problem and get Elasticsearch working?

Mai

1 Answer


The fact that your curl command returns a Squid error message suggests one of two things:

  1. Either Squid, and not Elasticsearch, is listening on port 9200. You can check which process is bound to port 9200 with sudo netstat -tnlp | grep :9200 or sudo ss -tnlp | grep :9200 (see the sketch after this list).

  2. Alternatively, you have configured curl with HTTP proxy settings and didn't add localhost, 127.0.0.1, and the system's own IP address(es) and hostname(s) to the proxy exclusion list.
    Unless you're debugging a proxy server, you don't use a proxy to connect to services that you can reach directly.
    Add the -v switch to your curl command; see for example this answer for an example of the signs that curl is using a proxy server.
    Set export no_proxy=internal.example.com,127.0.0.1,localhost to prevent curl from using the proxy for requests to those hosts.
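Putting those checks together, a minimal sketch for a Bash shell (internal.example.com is a placeholder for your host's own name; adjust as needed):

# 1. See which process is actually bound to port 9200; "squid" here
#    instead of "java" means Squid has taken the port.
sudo ss -tnlp | grep ':9200'

# 2. Check for proxy settings and watch curl's verbose output; a line
#    such as "Uses proxy env variable http_proxy" or a request sent to
#    the proxy host instead of 127.0.0.1 gives the proxy away.
env | grep -i proxy
curl -v http://127.0.0.1:9200/

# 3. Exclude local addresses from proxying for this shell, then retry;
#    the request should now reach the local port 9200 directly.
export no_proxy=internal.example.com,127.0.0.1,localhost
curl http://127.0.0.1:9200/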

Rob