Replacing Google Analytics with Cookie-Free JavaScript Geo Tracking

Scroll down for example code that saves data to an AWS S3 bucket and creates an AWS Lambda function to manipulate the data.

Running Google Analytics slows down site performance because any script added to the HTML head tag adds overhead to the app, and the same goes for any third-party library loaded there.

I found a free JSONP geolocation API, pulled country, city, and IP variables into my JavaScript, then passed these variables into a Lambda function and saved them as a JSON file in a simple AWS S3 bucket. This way I could avoid adding Google Analytics to my head tag, speeding up performance a bit. I included the following code just before my closing body tag.

<script type="text/javascript" src=""></script>
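As background on how the JSONP call works: the API responds not with raw JSON but with a small script that invokes the callback named in the request. A minimal Node sketch of that mechanism (the response body and its field names are made up here for illustration; your API's fields may differ):

```javascript
// JSONP: the server wraps its JSON payload in a call to the callback
// name you asked for, e.g. a request ending in ?callback=myCallback
// returns a script like the string below. The browser executes it,
// which invokes your global function with the data.

let received = null;

// the global callback the JSONP response will invoke
function myCallback(post) {
  received = post;
}

// simulate the <script src="...?callback=myCallback"> response arriving
// (203.0.113.7 is a documentation-range sample IP)
const fakeJsonpResponse =
  'myCallback({"country_name":"United States","city":"Boston","ip":"203.0.113.7"})';
eval(fakeJsonpResponse); // in a browser, the script tag does this implicitly

console.log(received.city); // Boston
```

Because the payload arrives as an executed script rather than a fetch response, no CORS setup is needed on the geolocation API's side.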


Then my callback function looks like this:

function myCallback(post){

    // globals shared with passVars() below
    session = Math.random().toString(36).substring(2, 15) + Math.random().toString(36).substring(2, 15); //added 11-11 default session

    pg_name = "whatever your page name is";

    // the field names on `post` depend on your geolocation API's response
    country = post.country_name;
    city = post.city;
    ip_address = post.ip;

    passVars(); //after the geo script runs, call passVars() to pass the vars to Lambda

} //end myCallback
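For reference, the session id above is just two runs of random base-36 digits concatenated; a small sketch (the helper name is mine, not from the original) of what it produces:

```javascript
// generate a session id the same way the callback does:
// Math.random().toString(36) yields "0.<random base-36 digits>",
// and substring(2, 15) keeps up to 13 of those digits
function makeSessionId() {
  return Math.random().toString(36).substring(2, 15) +
         Math.random().toString(36).substring(2, 15);
}

const id = makeSessionId();
console.log(id);                      // e.g. "j4w9nq38x4nl8v1fxy3ja"
console.log(/^[0-9a-z]+$/.test(id)); // true: only digits and lowercase letters
```

This is not cryptographically strong, but for distinguishing anonymous page views it is plenty, and it avoids setting any cookie.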

My passVars JavaScript function looks like this. Note that we call the Lambda function with fetch inside this function:

function passVars() {

    var date = new Date().toLocaleDateString('en-US');
    var now = new Date().toLocaleTimeString();
    var out = new Date().toLocaleTimeString();

    // pass the tracking fields (ses, hit2, etc.) to Lambda;
    // note: the Lambda handler expects this object to be called article
    const article = { ses: session, city2: city, hit2: date, date2: pg_name, ip2: ip_address, country2: country, time2: now, time3: out };

    // call the Lambda function here; the URL is the AWS API Gateway endpoint for it
    fetch("", {
        method: "POST",

        // adding body or contents to send
        body: JSON.stringify(article),

        // adding headers to the request
        headers: {
            "Content-type": "application/json; charset=UTF-8"
        }
    });

} //end passVars
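The wire format Lambda receives is simply the JSON-serialized article object. A sketch with sample values (the values are mine) showing the shape and the round trip to the Lambda side:

```javascript
// sample values standing in for the globals set by myCallback
const session = "j4w9nq38x4nl8v1fxy3ja";
const city = "Boston";
const pg_name = "Learn Java";
const ip_address = "203.0.113.7";  // documentation-range sample IP
const country = "United States";
const date = "11/22/2021";         // shape of toLocaleDateString('en-US')
const now = "7:52:35 PM";
const out = "7:52:35 PM";

// the exact shape passVars() sends; the Lambda handler reads these keys
const article = { ses: session, city2: city, hit2: date, date2: pg_name,
                  ip2: ip_address, country2: country, time2: now, time3: out };

const body = JSON.stringify(article);

// on the Lambda side, this parsed body becomes the event object
const event = JSON.parse(body);
console.log(event.date2);    // "Learn Java" -- date2 carries the page name
console.log(event.country2); // "United States"
```

Keep the key names in sync on both sides: the Lambda reads `event.country2`, `event.date2`, and so on, exactly as they are spelled here.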




My Lambda function looks like this:

const AWS = require('aws-sdk');
const fetch = require('node-fetch');
const s3 = new AWS.S3();

exports.handler = async (event) => {

    // first fetch the current json file from the s3 bucket;
    // this handler uses the event argument to pass data into that file
    const res = await fetch('');
    const json = await res.json();

    // then add a new record: take the geo variables passed in from the
    // javascript and push them onto the array
    json.push({
        country: event.country2,
        session: event.ses,
        page_name: event.date2,
        hit: event.hit2,
        ip: event.ip2,
        time_in: event.time2,
        time_out: event.time3,
        event_name: event.city2
    });

    // then re-write the whole updated file back to s3, overwriting the
    // existing json file; this is our only way of appending json data,
    // similar to an SQL INSERT command
    var params = {
        Bucket: 'xxxx',
        Key: 'tracker2.json',
        Body: JSON.stringify(json), // the fetched array plus the pushed record
        ContentType: 'application/json'
    };

    var s3Response = await s3.upload(params).promise();
    return s3Response;
};


My AWS S3 JSON file looks like this:

[{"country":"Philippines","session":"j4w9nq38x4nl8v1fxy3ja","page_name":"Learn Java","hit":"11/22/2021","ip":"","time_in":"7:52:35 PM","time_out":"7:52:35 PM"},{"country":"United States","session":"n0ja7c6z2vlec6tn7bd","page_name":"Learn Java","hit":"11/22/2021","ip":"","time_in":"7:04:57 AM","time_out":"7:04:57 AM"}]

After passing variables into my S3 JSON file, I can then render the result to HTML.
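A minimal sketch of that rendering step (the function and markup are my own, not the site's actual code); in the browser you would fetch the S3 file's URL first and pass the parsed array in:

```javascript
// build HTML table rows from the tracker records; in the browser:
//   fetch(S3_JSON_URL).then(r => r.json()).then(renderRows)
// where S3_JSON_URL is your bucket file's public URL (hypothetical name)
function renderRows(records) {
  return records.map(r =>
    `<tr><td>${r.country}</td><td>${r.page_name}</td>` +
    `<td>${r.hit}</td><td>${r.time_in}</td></tr>`
  ).join('\n');
}

const sample = [
  { country: "Philippines", page_name: "Learn Java",
    hit: "11/22/2021", time_in: "7:52:35 PM" }
];
console.log(renderRows(sample));
// <tr><td>Philippines</td><td>Learn Java</td><td>11/22/2021</td><td>7:52:35 PM</td></tr>
```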

For more about rendering the JSON file in the S3 bucket, see the SQL section of my website.

PS: I also added some event listeners to record time spent on the page. Just view the source code at my site for this.
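One common pattern for such listeners (a sketch of my own, not the site's actual code) is to stamp the load time and report the elapsed time when the page becomes hidden; the elapsed-seconds helper is plain JavaScript:

```javascript
// helper: whole seconds between two millisecond timestamps
function elapsedSeconds(startMs, endMs) {
  return Math.round((endMs - startMs) / 1000);
}

// browser wiring (commented out so this sketch also runs in Node);
// API_GATEWAY_URL is a hypothetical name for your Lambda endpoint:
//
// const start = Date.now();
// document.addEventListener('visibilitychange', () => {
//   if (document.visibilityState === 'hidden') {
//     const payload = JSON.stringify({ seconds: elapsedSeconds(start, Date.now()) });
//     navigator.sendBeacon(API_GATEWAY_URL, payload);
//   }
// });

console.log(elapsedSeconds(0, 90000)); // 90
```

`navigator.sendBeacon` is preferred over `fetch` here because it is designed to deliver small payloads reliably while a page is unloading.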

Happy coding folks!