Scroll down to find example code for saving data to an AWS S3 bucket and for creating an AWS Lambda function to manipulate the data.
Running Google Analytics slows down site performance because any script added to the HTML head tag adds overhead to the app, and the same goes for any third-party libraries loaded there.
I found a free JSONP geolocation API, pulled the country, city, and IP variables into my JavaScript, then passed those variables into a Lambda function and saved them as a JSON file in a simple AWS S3 bucket. This way I could avoid adding Google Analytics to my head tag and speed up performance a bit. I included the following code just before my closing body tag.
<script type="text/javascript" src="https://ipinfo.io/json?callback=myCallback">
</script>
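For anyone new to JSONP: the response that comes back is not raw JSON but a small script that calls your named callback with the data. The browser effectively executes something like the following sketch (the values are illustrative placeholders, and the exact fields and formats ipinfo.io returns may differ from what you see stored later in this post).

// illustrative only: what the JSONP script from ipinfo.io effectively runs in the browser
myCallback({
  ip: "203.0.113.7",   // example placeholder values, not real data
  city: "Manila",
  country: "PH"
});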
Then my callback function looks like this:
function myCallback(post) {
  // assigned without var/let so they become globals that passVars() can read later
  session = Math.random().toString(36).substring(2, 15) + Math.random().toString(36).substring(2, 15); // random session id
  pg_name = "whatever your page name is";
  country = post.country;
  city = post.city;
  ip_address = post.ip;
  passVars(); // now pass the collected variables on to the Lambda function
} // end myCallback
My passVars JavaScript function looks like this. Note that we call the Lambda function (through its API Gateway URL) with fetch inside this function:
function passVars() {
  var date = new Date().toLocaleDateString('en-US');
  var now = new Date().toLocaleTimeString();
  var out = new Date().toLocaleTimeString(); // placeholder; time_out starts out equal to time_in
  // bundle everything into one object; the property names (ses, hit2, etc.) must match what the Lambda reads off its event object
  const article = { ses: session, city2: city, hit2: date, date2: pg_name, ip2: ip_address, country2: country, time2: now, time3: out };
  // call the Lambda function here; this URL is the AWS API Gateway endpoint in front of it
  fetch("https://xxxx.execute-api.us-east-2.amazonaws.com/default/saveS3", {
    method: "POST",
    // the request body carries the tracking data
    body: JSON.stringify(article),
    // tell the endpoint we are sending JSON
    headers: {
      "Content-type": "application/json; charset=UTF-8"
    }
  });
}
My Lambda function looks like this:
const AWS = require('aws-sdk');
const fetch = require('node-fetch'); // must be included in the deployment package; not part of the Lambda runtime by default
const s3 = new AWS.S3();

exports.handler = async (event) => {
  // first fetch the current JSON file from the S3 bucket
  // the handler reads the tracking fields off the event object and appends them to that file
  const res = await fetch('https://xxxx.s3.us-east-2.amazonaws.com/tracker2.json');
  const json = await res.json(); // the file holds an array of records
  // then add a new record built from the geo variables the browser passed in
  json.push({
    country: event.country2,
    session: event.ses,
    page_name: event.date2, // the page name travels in the date2 property
    hit: event.hit2,        // the hit date travels in the hit2 property
    ip: event.ip2,
    time_in: event.time2,
    time_out: event.time3,
    event_name: event.city2 // the city travels in the city2 property
  });
  // then rewrite the whole updated array back to S3, overwriting the existing file;
  // this is our only way of "appending" JSON data, a bit like an SQL INSERT
  var params = {
    Bucket: 'xxxx',
    Key: 'tracker2.json',
    Body: JSON.stringify(json), // the fetched array plus the record we just pushed
    ContentType: 'application/json',
  };
  var s3Response = await s3.upload(params).promise();
  return s3Response;
};
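One wiring detail to be aware of: the handler above reads event.country2, event.ses, and so on directly, which works when API Gateway passes the request body straight through (for example via a mapping template). If you set the endpoint up as a Lambda proxy integration instead, the POST body arrives as a JSON string in event.body, and the function also has to return the CORS headers itself because the browser is calling from another origin. A minimal sketch of that variant, under those assumptions:

// sketch, assuming an API Gateway Lambda proxy integration (not necessarily how my endpoint is wired)
exports.handler = async (event) => {
  // under a proxy integration the body is a JSON string; fall back to the event itself otherwise
  const data = event.body ? JSON.parse(event.body) : event;
  // ...fetch tracker2.json, push the new record built from data.ses, data.country2, etc., and upload as above...
  return {
    statusCode: 200,
    headers: { "Access-Control-Allow-Origin": "*" }, // assumption: tighten this to your own domain
    body: JSON.stringify({ saved: true })
  };
};

Either way, the function's execution role needs s3:PutObject permission on the bucket, and fetching tracker2.json over its public URL only works if that object is readable without credentials.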
My AWS S3 JSON file looks like this:
[{"country":"Philippines","session":"j4w9nq38x4nl8v1fxy3ja","page_name":"Learn Java","hit":"11/22/2021","ip":"175.176.55.14","time_in":"7:52:35 PM","time_out":"7:52:35 PM"},{"country":"United States","session":"n0ja7c6z2vlec6tn7bd","page_name":"Learn Java","hit":"11/22/2021","ip":"66.249.66.130","time_in":"7:04:57 AM","time_out":"7:04:57 AM"}]
Once the variables have been saved to the S3 JSON file, I can fetch that file back and render the results to HTML.
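Here is a minimal sketch of that rendering step, assuming the bucket allows public reads of tracker2.json and the page has a table element with id "tracker" (both assumptions on my part):

// sketch: pull the tracker file back out of S3 and write each record into an HTML table
fetch("https://xxxx.s3.us-east-2.amazonaws.com/tracker2.json")
  .then(function (res) { return res.json(); })
  .then(function (records) {
    var table = document.getElementById("tracker"); // hypothetical element id
    records.forEach(function (r) {
      var row = table.insertRow();
      [r.country, r.page_name, r.hit, r.time_in, r.time_out].forEach(function (v) {
        row.insertCell().textContent = v;
      });
    });
  });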
For more about rendering the JSON file from the S3 bucket, see the SQL section of my website.
P.S. I also added some listeners to record the time spent on each page; just view the source code on my site for that.
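One common way to do that kind of thing (a sketch only, not the exact code running on my site) is to fire a final ping with the departure time when the page is hidden:

// sketch: send a final "time out" ping when the visitor leaves or hides the page
document.addEventListener("visibilitychange", function () {
  if (document.visibilityState === "hidden") {
    var payload = JSON.stringify({ ses: session, time3: new Date().toLocaleTimeString() });
    // sendBeacon keeps working while the page unloads; the Lambda would need to handle this update
    navigator.sendBeacon("https://xxxx.execute-api.us-east-2.amazonaws.com/default/saveS3", payload);
  }
});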
Happy coding folks!