Time for version 2… https://github.com/m0nkeyplay/TenableIO

The ch ch changes:

Some changes are big. There are now three main scripts, and I have scrapped the need to download separately. This should make it easier to set up the two switch-based scripts as a cron job or scheduled task. This was first done when I made the interactive script; I moved it to the switch-based ones once I saw it worked well. Time helps make things better (I hope).

Some changes are small. Spelling, grammar, and consistency between scripts have been checked and updated.

Each file has a ReadMe in the docs/ folder. Below is a brief overview.


Search the scans by answering some questions. This one is good for one-off reports. We all get them, don’t we? Check it out.


Use the optional switches to queue up and download what you need. This was originally ioSearchScansQueue3.py until the download was rolled into it. Check it out below.


So, you really want to download a lot of data and have the bandwidth and time to do it? This one is for you. It’s a modified search and download where you provide the scan and output type, and it will get the Critical, High, Medium, and Scan Info data for you. I did not provide a walk-through because it would just take a really long time.

In my last post (Consistency {code and APIs}) I was working out how to get data that was available into a tool I was working on. Working with the team at the vendor, we were able to push some API improvements to make it all work out.

I’m happy to say I was able to put together a scripted tool that can be used in-house at my job, as well as by anyone else who is using Tenable.io.

Many people use a free Nessus scanner to check for vulnerabilities. Many companies use Tenable’s Security Center on-premises, and like all things, it’s moving to the cloud with Tenable.io.

Moving to a new platform brings about challenges. But that is what I am here for, the challenges. The new platform is not as mature as the ones it is based on. Data is robust in the new platform, but pulling it into a data management tool has been… let’s stick with the word challenging.

How do things work?

Scans run on IO. Data is there. Someone needs to see it to act on it.

This should be very simple. Nothing is simple. Let the fun begin!

What is the goal?

Give the remediation team the data they need to get to work!

What do they need most to get this done?

For us: it’s hostname, pluginid, vulnerability plugin names, risk factor, and compliance names. Your mileage may vary, and since the tool is free to download and use, you can update it for your needs.

Let’s see this in action now (Hello World!).

Head on over to my TenableIO GitHub repo to get some searching on. Clone the repo. ioSearchScansQueue3.py and ioExportDownload3.py are the scripts we are using. Fill in the environment variables noted in the ReadMe and we are good to go.

A good test search is pluginid 19506 because that returns results on the scan itself.

python3 ioSearchScansQueue3.py -scan "Scan Name" -o csv -q pluginid -d 19506

Now pull down the results. Depending on the amount of data being asked for, the report that IO writes can take some time, so I broke this out into its own script.

python3 ioExportDownload3.py

We can now see that IO has put together a handy spreadsheet of data for us to review, hand off, or do something else with.
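Under the hood, the queue/download split mirrors Tenable.io’s scan export API: request an export, poll its status, then download the finished file. Here is a minimal sketch of that flow using only the standard library; the environment variable names, scan id, and function names are illustrative, not the repo’s exact conventions.

```python
import json
import os
import time
import urllib.request

API = "https://cloud.tenable.com"

def api_headers(access_key, secret_key):
    """Build the X-ApiKeys header that Tenable.io authenticates with."""
    return {
        "X-ApiKeys": f"accessKey={access_key};secretKey={secret_key}",
        "Content-Type": "application/json",
    }

def request_export(scan_id, fmt, headers):
    """Ask IO to generate a report; returns the file id to poll."""
    body = json.dumps({"format": fmt}).encode()
    req = urllib.request.Request(f"{API}/scans/{scan_id}/export",
                                 data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["file"]

def wait_and_download(scan_id, file_id, headers, out_path, poll=5):
    """Poll until the report is ready, then save it to disk."""
    status_url = f"{API}/scans/{scan_id}/export/{file_id}/status"
    while True:
        req = urllib.request.Request(status_url, headers=headers)
        with urllib.request.urlopen(req) as resp:
            if json.load(resp)["status"] == "ready":
                break
        time.sleep(poll)  # report generation can take a while
    dl = urllib.request.Request(f"{API}/scans/{scan_id}/export/{file_id}/download",
                                headers=headers)
    with urllib.request.urlopen(dl) as resp, open(out_path, "wb") as f:
        f.write(resp.read())

# Example usage (hypothetical variable names and scan id):
#   hdrs = api_headers(os.environ["TIO_ACCESS_KEY"], os.environ["TIO_SECRET_KEY"])
#   fid = request_export(42, "csv", hdrs)
#   wait_and_download(42, fid, hdrs, "scan_42.csv")
```

The scripts in the repo wrap this same request/poll/download cycle; splitting the download into its own step is what lets the slow report-generation phase run unattended.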

Update — added interactive functionality.

Not everyone likes to remember switches, so I added an interactive option that works like a question/answer to get your searches on. Downloading of the data is also included in the script.

Expanding on the simple search

The -search option does not need to be specific to one scan. If you have five scans with Vulnerability in the name, -search Vulnerability will provide results for all five scans.

As I noted above, we can search for the following as a one-shot or as a list read from a text file, creating a much more robust report.

  • plugin id
  • plugin name
  • hostname
  • risk factor
  • Compliance name*

*Vulnerability and compliance data are stored differently, so searching on a plugin name will not give you a compliance result. See Consistency {code and APIs}.

Data can also be queued up for download in the native .nessus format for import into any other tool.

And finally, because it’s out on GitHub for anyone to use, fork, or fix, a user is not stuck searching only for what I say. The dictionary of plugins to search is there to update as needed. Just choose what is important to your team from the documentation and add as needed.
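To give a feel for what customizing that dictionary looks like, here is a hypothetical sketch: a mapping from the value a user passes to -q to the filter field name sent to the API. The field names here are illustrative, not the repo’s exact values.

```python
# Illustrative only: map a user-facing query switch to an API filter field.
# The real dictionary and field names live in the repo's scripts.
QUERY_FIELDS = {
    "pluginid":   "plugin.id",
    "pluginname": "plugin.name",
    "hostname":   "host.hostname",
    "riskfactor": "risk_factor",
}

def build_filter(query, value):
    """Translate a (-q, -d) pair into a single API filter dict."""
    field = QUERY_FIELDS[query]  # KeyError here means an unsupported search type
    return {
        "filter.0.filter": field,
        "filter.0.quality": "eq",
        "filter.0.value": value,
    }
```

Adding a new search type for your team is then just one more entry in the dictionary.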

Don’t want to deal with all these fancy switches and just need to download scan data that needs attention on a schedule? I have one for you too. ioExportScanQueue3.py is what you are looking for. Queue these up in a batch job and the data is yours when you want it.

python3 ioExportScanQueue3.py --scan "scan name" --type csv
python3 ioExportDownload3.py
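If you want this on a schedule, the two commands drop straight into cron. A sketch of a crontab, with placeholder paths, times, and scan name:

```crontab
# Queue the export at 6:00, pull the finished report down at 6:30.
0 6 * * *  cd /opt/TenableIO && /usr/bin/python3 ioExportScanQueue3.py --scan "Weekly Vuln Scan" --type csv
30 6 * * * cd /opt/TenableIO && /usr/bin/python3 ioExportDownload3.py
```

The gap between the two jobs gives IO time to generate the report; size it to your scan data.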

I’m hoping these scripts help others since many have written tools that help me.

I plan to keep the whole repo updated as I work more and more with IO and need to get data/repeat tasks.

Comments, questions, fixes, and pretzels are always welcome!

My job is to get people consistent data that they can rely on to make decisions that, they tell me, cost a lot of money. More precisely, I (and so many others) do my job by hacking together the solutions that vendors promise in glossy sales decks and rarely deliver.

Over the past year I have spent a lot of time working with an external API. I’ve learned the beauty of being able to send calls and get data consistently. I’ve learned the limitations of my skills and work to improve them.

Consistency is important when working with an API. When the API says it will do x, it’s pretty important that it doesn’t do y. That’s wholly different data. When the API is the only way to get the data, because there are no built-in functions for a user to get or import it, I need to make the tool to do that.

That’s cool. That’s my job. That’s what I like to do. I like to hack things together to work. I like to solve problems that weren’t there until someone wanted something a little more from the program. These people thinking outside the box make me think outside the box.

I write more and more code to do this. My code is not always the prettiest or most elegant. There are probably many other ways to do what I am writing, so I can’t hold everyone to a bar so high that I can’t reach it myself.

What I can do though is ask, nay, say, that when providing an API be consistent.

API says it’s possible to search on a field. Oh, let’s say a description field.

We know the field is returned, because it works and is filled when we search on a pluginID in another script.

*This is probably a good time to note part of the way I work when adding functionality to a script: take the working script, write a new one with the new functionality to prove it can work without breaking the first script, then merge. All of this is happening in the second script, the one I want to merge.

Consistency says: we see a description field returned, and the API documentation says it’s a searchable field. Searching it for data we have already seen returned in another search should give us results based on that field alone.

So, why doesn’t it?

Working with testing and support, I came to learn there is not much consistency in the way the API works.

When searching a compliance audit rather than a vulnerability scan, the description field actually references a reference field, and none of this is in the documentation (neither the reference field nor the fact that the search hits different fields and mushes them together for the final output). That’s a lot of references to what seems to me a big limitation of the documentation.

It may take a time or two to read the above paragraph. I understand.

What to do?

We can’t just sit here with an issue and not fix it. My data readers still need their data. I still need to get it to them.

I am happy to work with a MacGyver-watching support engineer who comes up with some pretty good ideas. Right now, I may have to end up using them, based on turnaround times in the past. Happy about it? Nope. But people want what they want, and it’s my job to get it to them.

What have I learned?

I’ve learned that my code isn’t that bad and I was/am on the right track. There’s a bug that needs to be quashed, and I can’t do that myself. I’m pretty sure that when I get this working, someone besides me will be happy.

On a final note – I’ve been questioning my ‘hacker’ cred as of late. Maybe it’s Twitter, maybe it’s walls that I run into. Then something like this comes along, and I remember why I do what I do, why people pay me to do what I do, and what I am doing is hacking these systems to do what people want them to do.

The scripts I am working on can be found at my GitHub repo, all sanitized for others who work to get software running as sold. When this one is working, it will be added to it. Hopefully sooner rather than later.