SPLK-1002 Splunk Core Certified Power User – Splunk Advanced Concepts Part 2
The next command, one of the most important in Splunk, is btool. Using btool, you can list almost any configuration in a Splunk instance. Let us go through some examples. To invoke btool, go to your Splunk home directory, then bin, and run the splunk binary followed by the keyword cmd and then the utility name, btool. So this is the Splunk command btool, which is used for listing and verifying the configuration of a Splunk instance.
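The general invocation looks like this (the conf file name, such as props, transforms, or inputs, is whichever configuration you want to inspect; the path assumes a default Linux install):

$SPLUNK_HOME/bin/splunk cmd btool <conf-file-name> list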
As part of our first scenario to understand btool, we'll see how to perform a syntax check with it. This is the common command used for verifying the syntax of your Splunk configuration. As of now there is no syntax error, so btool reports nothing back. Let us intentionally make a syntax error and notice how it pinpoints the error in the configuration file. Whenever you edit Splunk configuration files, make sure to run this command before restarting, in order to verify your edits follow Splunk syntax. For this example, I'll go to inputs.conf and deliberately introduce a syntax mistake.
I'll remove the h from host and save the file. If I run the check command now, it narrows down the exact error and the file it occurs in. As you can see, it says invalid key in stanza [default], in this file, on line number two: ost. Of course it should be host. I'll change it back to host, and if I rerun the command, it no longer shows any errors. So any time you make modifications by editing configuration files, make sure to do a btool check before restarting Splunk.
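A rough sketch of that workflow from the CLI (the exact error text varies by Splunk version, and the file path here assumes the typo was made in system/local):

$SPLUNK_HOME/bin/splunk btool check
# no output means the configuration is syntactically clean
# after changing "host" to "ost" in inputs.conf, the check flags it, e.g.:
#   Invalid key in stanza [default] in /opt/splunk/etc/system/local/inputs.conf, line 2: ost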
One more important benefit of btool is that it can list a configuration no matter where it lives in your Splunk instance. Some people edit configuration in default, some in local, although editing default is not recommended. When team members have edited files scattered across apps/default, apps/local, system/default, and system/local, you don't know where to look for a configuration. btool will be your friend. So let's say I need to find a field extraction for my access log.
I don't know where it is defined. So what do I do? I'll run the btool command through the splunk utility, list the props configuration, and grep for my access log stanza. As you can see, I need to escape the opening bracket so grep treats it literally. It returns access_combined, access_combined_wcookie, and access_common. Let's say I need to find my access_common configuration in props, so I can go back to the command I used to filter it out.
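Something like the following (the stanza names shown are Splunk's built-in access log source types; the backslash escapes the bracket for grep):

$SPLUNK_HOME/bin/splunk cmd btool props list | grep '\[access'
# [access_combined]
# [access_combined_wcookie]
# [access_common]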
This time I will run it with the debug option so that it gives me the file name containing this configuration. As you can see, all this configuration is part of system default. In this way you can narrow down any configuration in your Splunk instance using btool. We'll go through some more examples. We have listed props; let's say I need to list all props extractions in my environment. I hit Enter and it keeps running until it lists every props configuration present in the Splunk instance. Similarly, you can list transforms, and it lists all the transforms, not only field extractions. You can also list inputs. These are the inputs enabled as of now: one UDP, one TCP, and one splunktcp.
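The commands used here look like this (the --debug flag prefixes every output line with the file it came from):

$SPLUNK_HOME/bin/splunk cmd btool props list access_common --debug
$SPLUNK_HOME/bin/splunk cmd btool transforms list
$SPLUNK_HOME/bin/splunk cmd btool inputs list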
This can be a little cumbersome to read, but the same command with the debug option gives you the filename. So if you are interested in a particular UDP input, you can just grab that filename, open it up, and narrow down the configuration. In this fashion, btool is highly useful in a large environment where configuration is distributed among apps/local, system/local, apps/default, and system/default. Instead of going through all those files, you can narrow it down using the Splunk CLI.
Even if you are new to a Splunk environment, you can use this command to help you out. In the same way, you can list your license server, cluster master, and outputs, and even knowledge objects such as the event types we created as part of our previous tutorials. You can list tags, dashboards, everything. For listing dashboards, the command is views, since on the back-end CLI all dashboards are internally represented as views. As you can see, these are some of the views present in the Splunk instance as of now.
These are the views in our Splunk instance. Here we'll be able to see the demo dashboards we created a couple of videos earlier. As you can see, we have demo, and we have copied demo in order to demonstrate other features of Splunk. This gives you much better visibility, and if I want to know where a dashboard is located, I can go for debug mode. Now I can see the location where my dashboard file lives. This is very efficient and very helpful for understanding a Splunk environment.
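As demonstrated in the video (demo is the dashboard name from our earlier tutorial; adjust the grep pattern to your own view names):

$SPLUNK_HOME/bin/splunk cmd btool views list --debug | grep demo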
These are some quick tips which you, as a Splunk developer, Splunk admin, or Splunk architect, can use to do your day-to-day activities efficiently. The first is restarting only the process you need: either Splunk Web or splunkd. You can restart Splunk Web using this command. Alternatively, you don't need to type out the complete command; you can just type restart with a double s (restartss). This restarts only Splunk Web. It usually asks for a username and password, or, if you are already authenticated, it directly restarts your Splunk Web interface.
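The two forms shown in the video (note that in newer Splunk releases Splunk Web runs inside splunkd, so these per-process restarts apply to older versions like the one demonstrated here):

$SPLUNK_HOME/bin/splunk restart splunkweb
$SPLUNK_HOME/bin/splunk restartss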
This is like a shortcut instead of typing the complete command. It is one way to refresh your UI for changes; the other way is through Splunk Web itself. Here, after the base URL, you append debug/refresh. As you can see, the complete URL is the debug/refresh endpoint under en-US. So this is your Splunk URL followed by that path, which gives you the option to refresh your Splunk configuration, including your dashboards, static content, and also your props and transforms.
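The endpoint looks like this (substitute your own host; 8000 is the default Splunk Web port):

http://<splunk-host>:8000/en-US/debug/refresh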
These are the items that were successfully reloaded into Splunk, and they include a lot of entities, your props and transforms among them. You can also narrow down which configuration you would like to refresh. One of the examples shown uses the admin endpoints, the configuration reload handlers exposed in this UI. If you specify such an entity, it reloads only the specific configuration mentioned in the URL. This reduces the need for frequent restarts in your Splunk environment.
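For example, to reload only dashboards, you can pass a single entity to the endpoint (admin/views is one of the entity names listed on the refresh page; other entities follow the same pattern):

http://<splunk-host>:8000/en-US/debug/refresh?entity=admin/views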
In this video, we'll be discussing data models in Splunk. Data models are one of the dataset types available in Splunk. Data models can speed up your Splunk searches at the cost of storage: a data model can be accelerated to hold more information for a longer period of time. Now, let us see how we can create these data models. Click on Settings and choose Data Models. Under Data Models, as you can see, we don't have any data models as of now. If you already have data models built on your non-production Splunk instances, you can download them and upload them in this console.
Since we do not have any other data model, we'll go ahead and create our own. I'll name it Splunk Demo Data Model, let it be part of the Search and Reporting app, and give it the description Demonstration of Data Model. Data models are widely used for creating charts and visualizations without the need for the search processing language, and, as a performance boost, they are also used by Splunk admins to improve search performance on search heads. This is the first menu of data model creation, once you give the name of your data model. Now, the first step in building a data model is to identify a root event, root search, or root transaction.
That is the first step in creating a data model. Let us say we need to create a data model for Windows failed authentications and Apache requests. Let us add a root event. You could also add a root search, which supports complex Splunk search queries, including pipes, transactions, and other commands, but for simplicity I'll be adding a root event. I'll name it demo root event. You can name this anything you want, but it is good to keep a reference of what the dataset will do. The constraint here will be index=main, where all our data is stored. I'll click on Preview in order to see some sample events.
As we can see, we have the sample events, so I'll go ahead and save. Now this root event covers both Windows and Apache logs. Let me create child events under the root event in order to segregate Windows and Apache. I'll add a child and name it Windows, with a filtering constraint of sourcetype=win* so that all the Windows logs fall under this child element. I'll click on Save. So here it is. Now I'll create one more child for my access log. I'll name it apache, with sourcetype=access_combined_wcookie, and click on Save.
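The resulting constraint hierarchy looks roughly like this (child constraints are combined with the root constraint; the names are the ones chosen in this demo):

demo root event:  index=main
  Windows:        index=main sourcetype=win*
  apache:         index=main sourcetype=access_combined_wcookie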
As you can see, we now have our two filters, and the search for each child automatically combines both constraints. You can consider this the base search for your Apache, or web server, logs, and similarly this one the base search for your Windows logs. If you'd like to add any fields for your Windows or web server logs, you can do so by clicking the Add Field button, which has four options. You can calculate a new field, or you can look up a database or a lookup file, including the KV store, for new fields; that is data enrichment.
Similarly, you can extract new fields using regular expressions, and you can also add geolocation for any IP-based fields. Let us go to our demo events. As you can see, we have just the predefined default fields as part of our data model, so let us go and extract some fields.
When you click on the root event, you can see an Auto-Extracted option under Add Fields. I'll go ahead and click on Auto-Extracted. As you can see, it has discovered all these fields from the recent logs, so I'll select all of them and click on Save. These fields will be added to your extractions, so that whenever this data model runs, it scans index main, extracts all these fields, and keeps an accelerated copy, that is, a summary.
Whenever you search the logs, it will be able to fetch and give you the results almost instantaneously. Now let us see how this works. I'll go to Windows and choose the Pivot option in the data model screen; we have not yet accelerated the model. Here it is looking over All Time, so I'll change it to the last 30 days. In the last 30 days we have 14,000 Windows events. Let's say I need to split the rows by account name.
I'll just choose Account Name here and press Enter, and all the account names will be displayed. Similarly, if I need the computer name, I can click on Computer Name. These fields were auto-extracted as part of our data model creation. Not only this, you'll be able to create any number of reports and charts based on it. Let's say I have these statistics and need to save them as a report or a dashboard.
You can go ahead and click on Save As, where you'll get an option to save as a report or a dashboard; fill out the information and click on Save. Similarly, you can visualize this by picking any of the visualizations, and the one you choose will be automatically populated. There are other options you can choose depending on the data available, so that the corresponding chart is displayed, and you can save them as dashboard panels or reports.
This is one of the major advantages of having a data model or dataset: creating visualizations and reports without knowledge of the Splunk search language. Let us go back to our Edit Data Model console and accelerate the search. Before accelerating, I'll share the permissions, then click on Accelerate. I'll give it a range of three months; the longer the range, the more time it takes to complete the acceleration.
Once you have enabled acceleration, you can't edit the data model. If you want to edit it, you need to go back and disable the acceleration first. Go back to All Data Models to check the status of the data model, whether acceleration is complete and how much storage it is using. Find your data model name and click the arrow to expand it. As you can see, the status is Building and it has used 0 MB so far. We'll wait for it to build.
Now that we have understood how to create a data model, how to add a root event or root search, and how to add additional fields, let us go ahead and see how we can use these data models to build a search without knowing any search queries. There are two methods.
You can either access them under Settings > Data Models, or you can go to Search and Reporting, or any other app that has datasets, and click on Datasets. Under Datasets, choose the data model; I'll choose apache, which we created earlier. Under apache, we have all these fields in the dataset. Let us go ahead and create a nice visualization without needing to know the commands used to build charts, visualizations, or reports. Go to Manage > Edit in Table. Once you click it, you will be redirected to a separate UI, which is used for extracting fields, rearranging fields, calculating averages, and all the typical search operations usually built by Splunk admins.
Once you have this screen, as you can see, it has all the fields extracted from our access logs. Let's say I need only the client IP and a count of each action. Let us pick some of the fields rather than all of them, so I'll go ahead and filter. This is the complete search query being run as of now. Choose any field you feel is not necessary for this scenario, click Edit, then Delete, and it will be removed. If you have a lot of unnecessary fields, you can instead go to Datasets, choose the data model you'd like to remove fields from, and click Edit in Data Model. Once you're in the Data Model Editor, you'll be able to remove fields from the data model overall. Go to apache and select whichever fields are not necessary. We need action and client IP. We don't need the cookie or all the date fields, because we have _time, which can be used to derive all of them. We don't need eventtype, field, file, host, or identity. We don't need most of these fields; we might require JSESSIONID, and we need method and status as well. Once they are selected, go to Bulk Edit and mark them as hidden. Let us save this, and once it is saved, reload the table. As you can see, we now have only the fields we selected.
Now that we have this, let us use Extend in Table, where we can create tabular columns that can be used as part of reports or alerts. Here, as you can see, I'll add stats. Once you add stats, you'll get the stats menu where you can split the events; I'll split the rows by client IP, so I select clientip and click Apply. Then I'll add one aggregation on the action field, a count of its values, and click Apply. As you can see, we populated a new statistics table without writing any search queries. Once we have created this, we can save it as a new table that can be used as a lookup or another dataset.
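Behind the scenes, the table UI generates SPL for us. A rough equivalent of what we just built, using the datamodel search command (the data model ID and field prefixes here are assumptions based on the names used in this demo, so adjust to your environment):

| datamodel splunk_demo_data_model apache search
| stats count(apache.action) by apache.clientip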
Hi. In this video we will be seeing how to deploy the universal forwarder in a large environment. Let's say we have more than 100 or 200 machines from which we need to collect data; those will be the data sources for our Splunk environment. Let us see the best way to install these agents on all the data sources. For Windows, we have a script where you need to customize, first, the Splunk installation file, that is, the MSI file; we have already seen in the package downloads where and how to download these files. Make sure to replace this with the latest version you have downloaded. There is one entry for 32-bit data sources running the Windows operating system and one for 64-bit Windows systems.
Once you have made sure these two are set to the right path and installation file name, the next value is the deployment server. Make sure you provide the complete hostname or IP address of your deployment server, which these data sources, the Splunk agents, will communicate with in order to pull the configuration that needs to be in place for the forwarders. Once these three values are set, the next step is the execution, the installation of the universal forwarder through our script. The installation command uses the msiexec utility, with a variable holding our installation package, and we make sure to agree to the Splunk license.
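Stripped of the script's variables, the core install command looks roughly like this (the package name, deployment server host, and port are placeholders; DEPLOYMENT_SERVER and AGREETOLICENSE are standard Splunk MSI properties):

msiexec.exe /i splunkforwarder-<version>-x64-release.msi DEPLOYMENT_SERVER="ds.example.com:8089" AGREETOLICENSE=Yes /quiet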
During the installation itself, we specify our deployment server hostname and port details so that once the forwarder starts up, it can automatically report to the deployment server and pull its configuration; after that, we can change any configuration related to these agents centrally. So this is one way of doing it. The other way is if you want your universal forwarder to run under a separate user account, that is, a service account, on the Windows operating system for privileged log collection; let's say we need to collect registry settings. The next command shows how to install the universal forwarder as a different user.
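A sketch of that run-as-user variant (LOGON_USERNAME and LOGON_PASSWORD are the MSI properties for running under a service account; the account shown is a placeholder):

msiexec.exe /i splunkforwarder-<version>-x64-release.msi DEPLOYMENT_SERVER="ds.example.com:8089" LOGON_USERNAME="DOMAIN\splunksvc" LOGON_PASSWORD="<password>" AGREETOLICENSE=Yes /quiet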
So you can choose either of these two commands. The next piece of configuration simply makes sure we have all the necessary logging enabled on the forwarder; it lives under the Splunk etc directory, in log-local.cfg. If you have customized any of the logging levels on your Splunk instance, make sure to apply the same logging levels to the forwarder. Once that is done, the forwarder is started up. After the command executes, it displays a message saying the installation was successful, and if any of the installation steps go wrong, it reports that the installation has failed.
In this way, we can push the script through GPO or whatever policy mechanism you use in your organization, and use it to install the agent across all the machines spread throughout your organization. This was for Windows. Now let us see Linux. For Linux we have a bash script where, similar to the Windows one, we'll be changing, first, the installation package, and second, the deployment server hostname or IP address.
Here, based on the OS we are installing on, let's say AIX, a generic Linux, a Red Hat based system, or simply a tar package install, we can choose the respective package. If it is AIX, the script goes for the .tgz package; if it is a Debian-based Linux system, it goes for the .deb package; and if it is yum-based, that is, CentOS or Red Hat Linux, it goes for the .rpm. Before the installation, make sure you download all these packages and place them in the source location you provide; this might be a shared drive or network storage with a common path from which all the agents can download the files.
Once the script has determined which OS and which package needs to be installed, the next step is the installation. If it is a Debian-based system it uses dpkg; if it is an RPM-based system it uses rpm; and for any other common Linux it uses a tar extraction. Once the installation is complete, it starts the Splunk process while accepting the license. In our previous videos we have seen what this command does: it silently starts up Splunk without asking for any details.
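A minimal sketch of that selection-and-install logic (package file names and the shared path are illustrative; the start flags are Splunk's documented options for an unattended first start):

#!/bin/bash
PKG_DIR="/mnt/share/splunk"   # common path reachable by all agents
if command -v dpkg >/dev/null 2>&1; then
    dpkg -i "$PKG_DIR/splunkforwarder-<version>-linux-amd64.deb"
elif command -v rpm >/dev/null 2>&1; then
    rpm -i "$PKG_DIR/splunkforwarder-<version>.x86_64.rpm"
else
    tar -xzf "$PKG_DIR/splunkforwarder-<version>-Linux-x86_64.tgz" -C /opt
fi
# silent first start: accept the license and skip all prompts
/opt/splunkforwarder/bin/splunk start --accept-license --answer-yes --no-prompt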
Once the agent has successfully started, the script sets the deploy-poll to your deployment server so that the rest of the configuration can be pulled from it. We also enable it as a boot process, so that whenever a system reboots and comes back online, the Splunk service comes back online along with the other services. Once the deployment poll and the logging configuration are in place, Splunk is started back up.
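These two steps map to documented forwarder CLI commands (the host is a placeholder; 8089 is the default management port):

/opt/splunkforwarder/bin/splunk set deploy-poll ds.example.com:8089
/opt/splunkforwarder/bin/splunk enable boot-start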
After starting, the script makes sure all the privileges required by the Splunk user, our application account, have been granted on /opt/splunkforwarder, the agent's installation home directory. Once you have installed the universal forwarder agents, the default credentials are admin/changeme. To change this, we have a command you can copy and paste into your installation script on Linux or Windows; it works without any issues, so the default Splunk password is changed during the installation itself.
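A sketch of that password change (the new password is a placeholder; the -auth argument supplies the default credentials being replaced):

/opt/splunkforwarder/bin/splunk edit user admin -password '<new-password>' -auth admin:changeme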