SLBC setup with one FortiController-5103B (Expert)


This example describes the basics of setting up a Session-aware Load Balancing Cluster (SLBC) that consists of one FortiController-5103B, installed in chassis slot 1, and three FortiGate-5001C workers, installed in chassis slots 3, 4, and 5. This SLBC configuration can have up to eight 10Gbit network connections.

For more information about SLBC, see the FortiController Session-aware Load Balancing Guide.

1. Hardware setup

Install a FortiGate-5000 series chassis and connect it to power. Install the FortiController in slot 1. Install the workers in slots 3, 4, and 5. Power on the chassis.

Check the chassis, FortiController, and FortiGate LEDs to verify that all components are operating normally. (For normal-operation LED status, see the FortiGate-5000 series documentation.)

Check the FortiSwitch-ATCA release notes and install the latest supported firmware on the FortiController and on the workers. Get FortiController firmware from the Fortinet Support site. Select the FortiSwitch-ATCA product.

2. Configuring the FortiController

Connect to the FortiController GUI (using HTTPS) or CLI (using SSH) at the default IP address (https://192.168.1.99), or connect to the FortiController CLI through the console port (Bits per second: 9600, Data bits: 8, Parity: None, Stop bits: 1, Flow control: None). Log in using the admin administrator account with no password.

Add a password for the admin administrator account. From the GUI use the Administrators widget or from the CLI enter this command.

  config admin user
    edit admin
       set password <password>
    end

Change the FortiController mgmt interface IP address. From the GUI use the Management Port widget or from the CLI enter this command.

  config system interface
    edit mgmt
       set ip 172.20.120.151/24
    end
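If you also want to control which management protocols can reach the mgmt interface, the interface configuration accepts an allowaccess option, as in FortiOS. This is a sketch only: the protocol list below is an example, and you should confirm that your FortiController firmware release supports the option.

  config system interface
    edit mgmt
       set allowaccess https ssh ping
    end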

If you need to add a default route for the management IP address, enter this command.

  config route static
    edit route 1
        set gateway 172.20.120.2
    end

Set the chassis type that you are using.

 config system global
    set chassis-type fortigate-5140
 end

Go to Load Balance > Config to add the workers to the cluster by selecting Edit and moving the slots that contain workers to the Members list.

The Config page shows the slots in which the cluster expects to find workers. Since the workers have not been configured yet, their status is Down.

Configure the External Management IP/Netmask. Once you have connected workers to the cluster, you can use this IP address to manage and configure them.


You can also enter the following CLI command to add slots 3, 4, and 5 to the cluster:

config load-balance setting
    config slots
      edit 3
      next
      edit 4
      next
      edit 5
      end
   end

You can also enter the following CLI command to configure the external management IP/Netmask and management access to this address:

config load-balance setting
    set base-mgmt-external-ip 172.20.120.100 255.255.255.0
    set base-mgmt-allowaccess https ssh ping
 end
  

3. Adding the workers

Enter this command to reset each worker to factory default settings.

 execute factoryreset

If the workers are going to run FortiOS Carrier, apply the FortiOS Carrier license instead; applying the license also resets the worker to factory default settings.

Register each worker and apply licenses to each worker before adding the workers to the cluster. This includes FortiCloud activation and FortiClient licensing, and entering a license key if you purchased more than 10 Virtual Domains (VDOMs). You can also install any third-party certificates on the primary worker before forming the cluster. Once the cluster is formed, third-party certificates are synchronized to all of the workers. FortiToken licenses can be added at any time because they are synchronized to all of the workers.

Log into the CLI of each worker and enter this CLI command to set the worker to operate in FortiController mode.

 config system elbc
    set mode forticontroller
 end

The worker restarts and joins the cluster. On the FortiController GUI, go to Load Balance > Status. As the workers restart, they should appear in their appropriate slots.

The worker in the lowest slot number usually becomes the primary unit.
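You can also check the cluster from the FortiController CLI. The following command comes from the FortiController SLBC documentation (verify that it is available in your firmware release); it lists each slot and the status of the worker in it.

 get load-balance status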

 

4. Results

You can now manage the workers in the same way as you would manage a standalone FortiGate. You can connect to the worker GUI or CLI using the External Management IP. If you configured the worker mgmt1 or mgmt2 interfaces, you can also connect to one of those addresses to manage the cluster.

To operate the cluster, connect networks to the FortiController front panel interfaces and connect to a worker GUI or CLI to configure the workers to process the traffic they receive. When you connect to the External Management IP, you connect to the primary worker. When you make configuration changes, they are synchronized to all workers in the cluster.

By default on the workers, all FortiController front panel interfaces are in the root VDOM. You can configure the root VDOM or create additional VDOMs and move interfaces into them.

For example, you could connect the Internet to FortiController front panel interface 4 (fctrl/f4 on the worker GUI and CLI) and an internal network to FortiController front panel interface 2 (fctrl/f2 on the worker GUI and CLI). Then enter the root VDOM and add a policy to allow users on the internal network to access the Internet.
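As a sketch, from the worker CLI the policy could look like the following (entered in the root VDOM; the source and destination addresses, service, and NAT setting are assumptions about the topology, so adjust them to match your network):

 config firewall policy
    edit 1
       set srcintf fctrl/f2
       set dstintf fctrl/f4
       set srcaddr all
       set dstaddr all
       set action accept
       set schedule always
       set service ALL
       set nat enable
    end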

For further reading, check out the FortiController Session-aware Load Balancing Guide.

Bill Dickie

Our Fearless Documentation Leader at Fortinet
After completing a science degree at the University of Waterloo, Bill began his professional life teaching college chemistry in Corner Brook, Newfoundland and fell into technical writing after moving to Ottawa in the mid '80s. Tech writing stints at all sorts of companies finally led to joining Fortinet to write the first FortiGate-300 Administration Guide.