EFW Support




Title: Trouble with blacklisting...
Post by: ufirst on Monday 10 January 2011, 11:04:20 pm
Hi, I'm using Endian release 2.4.0 and
I'm trying to block all sites except five specified sites.

First I created a new profile under Content filter,
then I put those five allowed sites into the "Allow the following sites" box under the Custom black- and whitelists category.

But when I tested it, all sites were still allowed through.
If you can, please help me resolve this problem. Thanks.


Title: Re: Trouble with blacklisting...
Post by: mrkroket on Tuesday 11 January 2011, 02:45:11 am
Forget the content filter profile.

Just create two rules in HTTP Proxy -> Access Policy:

The first rule allows the 5 sites (starting with a dot, like .microsoft.com for any Microsoft subdomain, or the full hostname for a specific subdomain, e.g. www.microsoft.com).
The second rule denies everything else.

That's all.
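If it helps to picture what the leading dot means, here is a small stand-alone sketch (not Endian's actual matching code, just an illustration of the convention) showing how a dotted entry covers the domain and all of its subdomains, while a full hostname matches only itself:

# Illustration of the leading-dot convention used in proxy domain rules.
# ".microsoft.com" matches microsoft.com and every subdomain of it;
# "www.microsoft.com" matches only that exact hostname.

def host_matches(host: str, rule: str) -> bool:
    host = host.lower().rstrip(".")
    rule = rule.lower()
    if rule.startswith("."):
        return host == rule.lstrip(".") or host.endswith(rule)
    return host == rule

allowed = [".microsoft.com", "www.example.org"]

for h in ["www.microsoft.com", "msdn.microsoft.com", "example.org", "www.example.org"]:
    verdict = "allow" if any(host_matches(h, r) for r in allowed) else "deny"
    print(h, "->", verdict)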


Title: Re: Trouble with blacklisting...
Post by: ufirst on Tuesday 11 January 2011, 04:11:44 am
Thanks, mrkroket.

I'll put that in to test and come back to you...   :)


Title: Re: Trouble with blacklisting...
Post by: mrkroket on Thursday 13 January 2011, 03:46:48 am
It's a simple rule.
First make sure that you enable the HTTP proxy and set it to transparent mode.

After that go to Proxy -> HTTP -> Access Policy.
Delete all rules if there are any.

Create a rule:
  Source Type: ANY
  Destination Type: domain
  Insert domains:
    .microsoft.com
    .google.com
    etc.
  (domains always with a leading dot)
  Access Policy: Allow Access
Press Create Policy.

Create a second rule:
  Source Type: ANY
  Destination Type: ANY
  Access Policy: Deny Access
  Position: Last Position

The second rule isn't strictly necessary; it just makes sure everything else is blocked.
Save and test.
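If you want to script the "test" step from a client on the GREEN network, a rough check like this (just a sketch; the URLs are placeholders for your own allow list) fetches a few plain-HTTP sites and reports which ones the proxy lets through:

# Sketch: request a few plain-HTTP URLs from a client behind the
# transparent proxy and report which are allowed and which are blocked.
# No proxy settings are needed on the client, since port 80 traffic is
# intercepted transparently; HTTPS is not intercepted at all.
import urllib.request
import urllib.error

urls = [
    "http://www.microsoft.com/",  # expected: allowed by the first rule
    "http://www.google.com/",     # expected: allowed by the first rule
    "http://www.example.com/",    # expected: denied by the second rule
]

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{url} -> allowed (HTTP {resp.status})")
    except urllib.error.HTTPError as e:
        # A blocked request usually comes back as 403 Forbidden from the proxy
        print(f"{url} -> blocked (HTTP {e.code})")
    except urllib.error.URLError as e:
        print(f"{url} -> connection problem: {e.reason}")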


Title: Re: Trouble with blacklisting...
Post by: TheEricHarris on Friday 14 January 2011, 10:41:17 am
I do this by creating a content filter profile and putting ** in the blocked sites box. Then I put the sites I want to allow in the allowed list. Finally, I create an access policy for the IP/network/MAC and associate it with the profile created in step 1.


Title: Re: Trouble with blacklisting...
Post by: ufirst on Friday 14 January 2011, 07:09:55 pm
Thanks, mrkroket.

It's working great...
 :) :) :) ;D ;D

I really appreciate your help.


Title: Re: Trouble with blacklisting...
Post by: mza122 on Friday 09 September 2011, 05:52:34 pm
Thanks mrkroket, your suggestion worked for me, but there is a problem: the allowed websites are not displaying images (they open in text-only mode), and HTTPS sites don't open at all, giving a 403 Forbidden error.
Please help.


Title: Re: Trouble with blacklisting...
Post by: mrkroket on Wednesday 14 September 2011, 12:25:10 am
With a transparent proxy, HTTPS traffic doesn't go through the proxy; it follows the Outgoing firewall rules instead.

So you can't filter HTTPS on a transparent proxy (intercepting it would amount to a man-in-the-middle attack); you must create the appropriate Firewall -> Outgoing firewall rules to allow or block HTTPS traffic.

About the missing images: modern websites usually use two domains, one for active content (which sends cookies) and another for static content, because it saves traffic.
So check whether the website is pulling content from external domains (usually Amazon S3 or similar) and allow those as well.
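If you want to see which extra hostnames a page pulls content from, a rough stand-alone script like this (just a sketch using Python's standard library; replace the placeholder URL with one of your allowed sites) will list them so you can add them to the allow rule:

# Sketch: fetch a page and list the other hostnames referenced in
# src/href attributes, to spot static-content domains (CDNs, Amazon S3,
# etc.) that may also need to be allowed in the proxy rule.
from html.parser import HTMLParser
from urllib.parse import urlparse
import urllib.request

class HostCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).netloc
                if host:
                    self.hosts.add(host)

page_url = "http://www.example.com/"  # placeholder: use one of your allowed sites
with urllib.request.urlopen(page_url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

collector = HostCollector()
collector.feed(html)

page_host = urlparse(page_url).netloc
for host in sorted(collector.hosts):
    if host != page_host:
        print("external host:", host)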