Tuesday, November 21, 2017

AWS IoT Update – Better Value with New Pricing Model

Our customers are using AWS IoT to make their connected devices more intelligent. These devices collect & measure data in the field (below the ground, in the air, in the water, on factory floors and in hospital rooms) and use AWS IoT as their gateway to the AWS Cloud. Once connected to the cloud, customers can write device data to Amazon Simple Storage Service (S3) and Amazon DynamoDB, process data using Amazon Kinesis and AWS Lambda functions, initiate Amazon Simple Notification Service (SNS) push notifications, and much more.

New Pricing Model (20-40% Reduction)
Today we are making a change to the AWS IoT pricing model that will make it an even better value for you. Most customers will see a price reduction of 20-40%, with some receiving a significantly larger discount depending on their workload.

The original model was based on a charge for the number of messages that were sent to or from the service. This all-inclusive model was a good starting point, but also meant that some customers were effectively paying for parts of AWS IoT that they did not actually use. For example, some customers have devices that ping AWS IoT very frequently, with sparse rule sets that fire infrequently. Our new model is more fine-grained, with independent charges for each component (all prices are for devices that connect to the US East (Northern Virginia) Region):

Connectivity – Metered in 1-minute increments and based on the total time your devices are connected to AWS IoT. Priced at $0.08 per million minutes of connection (equivalent to $0.042 per device per year for 24/7 connectivity). Your devices can send keep-alive pings at 30-second to 20-minute intervals at no additional cost.

Messaging – Metered by the number of messages transmitted between your devices and AWS IoT. Pricing starts at $1 per million messages, with volume pricing falling as low as $0.70 per million. You may send and receive messages up to 128 kilobytes in size. Messages are metered in 5 kilobyte increments (up from 512 bytes previously). For example, an 8 kilobyte message is metered as two messages.

Rules Engine – Metered for each time a rule is triggered, and for the number of actions executed within a rule, with a minimum of one action per rule. Priced at $0.15 per million rules-triggered and $0.15 per million actions-executed. Rules that process a message in excess of 5 kilobytes are metered at the next multiple of the 5 kilobyte size. For example, a rule that processes an 8 kilobyte message is metered as two rules.

Device Shadow & Registry Updates – Metered on the number of operations to access or modify Device Shadow or Registry data, priced at $1.25 per million operations. Device Shadow and Registry operations are metered in 1 kilobyte increments of the Device Shadow or Registry record size. For example, an update to a 1.5 kilobyte Shadow record is metered as two operations.
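
To make the metering rules concrete, here is a rough cost-estimator sketch in Python. It hard-codes the US East prices above, ignores the messaging volume discount, and assumes the 5 kilobyte increments apply to the rule trigger (the post doesn't spell out how actions are sized) – illustrative only, not an official calculator.

```python
# Rough sketch of the new AWS IoT pricing model described above.
# US East (Northern Virginia) prices; illustrative only.
import math

def monthly_iot_cost(devices, connected_minutes_per_device,
                     messages, avg_message_kb,
                     rules_triggered, actions_executed, avg_rule_kb,
                     shadow_ops, avg_record_kb):
    # Connectivity: $0.08 per million connection-minutes.
    connectivity = devices * connected_minutes_per_device * 0.08 / 1e6

    # Messaging: metered in 5 KB increments; $1.00 per million at the
    # entry tier (volume pricing falls to $0.70, ignored here).
    messaging = messages * math.ceil(avg_message_kb / 5) * 1.00 / 1e6

    # Rules Engine: $0.15 per million rules triggered plus $0.15 per
    # million actions executed. We assume the 5 KB size increments
    # apply to the rule trigger; actions are counted flat.
    rules = (rules_triggered * math.ceil(avg_rule_kb / 5)
             + actions_executed) * 0.15 / 1e6

    # Device Shadow & Registry: $1.25 per million operations, metered
    # in 1 KB increments of the record size.
    shadow = shadow_ops * math.ceil(avg_record_kb) * 1.25 / 1e6

    return connectivity + messaging + rules + shadow

# Example: 100 devices online 24/7 for a 30-day month, each sending
# one 2 KB message per minute through a single one-action rule.
msgs = 100 * 30 * 24 * 60
cost = monthly_iot_cost(100, 30 * 24 * 60, msgs, 2, msgs, msgs, 2, 0, 1)
print(f"~${cost:.2f} per month")  # roughly $5.96
```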

The AWS Free Tier now offers a generous allocation of connection minutes, messages, triggered rules, rules actions, Shadow, and Registry usage, enough to operate a fleet of up to 50 devices. The new prices will take effect on January 1, 2018, with no effort on your part. At that time, the updated prices will be published on the AWS IoT Pricing page.

AWS IoT at re:Invent
We have an entire IoT track at this year’s AWS re:Invent. Here is a sampling:

We also have customer-led sessions from Philips, Panasonic, Enel, and Salesforce.

Jeff;



from AWS News Blog http://ift.tt/2zVX9So
via IFTTT

Low Poly T-Rex – 3D Printing Time-lapse #3DPrinting

from Low Poly T-Rex – 3D Printing Time-lapse #3DPrinting
by Pedro Ruiz

Every Tuesday we’ll 3D print designs from the community and showcase slicer settings, use cases and, of course, time-lapses!

Low Poly T-Rex
By: WONGLK519
http://ift.tt/2iCyoBd
Printer: BCN3D Sigma R17
Filament: Teal PLA
Print time: 19 hrs 19 min
Dimensions: X:249 Y:295 Z:215 mm
Layer height: .15 mm / .4 mm nozzle
Infill: 10% / 4.5 mm retract
Temperatures: 230 °C / 0 °C
Filament used: 660 g
Speed: 50 mm/s

Monday, November 20, 2017

A Review of Web Summit: Part 2 One Week On

from A Review of Web Summit: Part 2 One Week On
by Sarah O'Connell

The worn, smooth cobbles and rustic architecture of Lisbon’s streets were certainly an excellent backdrop for this year’s Web Summit: a contrast of old vs new; shining chrome structures and American accents somehow harmonising with flagstones and the soft, lilting Portuguese dialect. The Altice Arena, which hosted the event, erupted from the centre of a more modernised area of the city. If you hadn’t visited the Summit in Lisbon before, you would never suspect that an event like this was hidden amongst the meandering streets.

I was looking forward to getting to the event on its official opening day. But after a first day spent inside watching the stages being built, and amid the glitzy promise of the Opening Night, my first true ‘attendee’ experience of the event was, sadly, queuing for over an hour in the beating sun to get in through the arena’s single entrance. It seemed that all 59,500 attendees had arrived at once.

Not a great start.

There are always going to be queues at an event of this scale, but having just one bottlenecked entrance felt ludicrous. Despite this, the crowd remained surprisingly upbeat, with people taking the opportunity to talk openly to those around them in the queue. I saw more than one set of business cards being exchanged before we could even see the doors.

From a logistics point of view, the security checks also differed completely depending on which section of the entrance you went through: some people were scanned through metal detectors, others weren’t. Some had their bags searched and had to remove coats and shoes; I just had to open mine and was waved through – not comforting for an event hosting so many people packed in together.

However, after finally making it inside, it was great to see the ‘finished product’: the event halls were made up of far more ambitious shells and stands than I have seen at many events. The ‘Instagram effect’ in particular was really coming into its own. Mercedes-Benz, for example, had constructed an entire two-storey hub, designed to host workshops for start-ups at the event; Google had built a giant doll’s house of sorts to house their own mini-workshops. It’s clear that companies were happy to be investing money into this show, and it only takes one look at the exhibitor list to recognise that some of the world’s biggest brands were in attendance.

There was plenty going on, with each hall hosting at least two stages of talks alongside the hundreds of stands. It was sensory overload: music erupted from everywhere – one of the halls even had a radio station, complete with hosts talking live about the event. The Summit’s signature dubstep and purple-hued lighting permeated throughout. It felt more like a crowded bar than a tech event.

Undoubtedly, there were some gems within the Summit. The Women in Tech lounge provided some much-needed breathing space away from the pace of the bustling halls – and there were phone charging ports – hurrah! There were also plenty of interactive features built into the event, such as a gif-recording camera, and an AI machine that could tell you about your personality once you’d answered five simple questions.

After spending the majority of my time in the main arena, I found The Forum a different world entirely. An invite-only section of the show, it had an atmosphere poles apart: no purple lights, no dubstep. The first thing I noticed on arriving was how much seating there was compared to the rest of the event. The entire room was set up to be inviting. There, the conversations I had were much more relaxed, as conversations tend to be over a glass of wine on a comfy seat. The room was mood-lit, and people walked up to each other and connected, rather than meeting in a shower of business cards with an agenda ready to push.

Overall, it was like two separate events. Perhaps Cosgrave and his team were also mindful that their ‘VIPs’ would need a safe haven. I spoke to two of them, who shared the same message: ‘I honestly haven’t left the Forum to attend the rest of the event.’ It’s a shame, really, that aside from a few exceptions, these ‘tech Gods’ didn’t socialise with the attendees who had spent hundreds – if not thousands – on attending. I should note at this point that my entry into the Forum was yet another security gaffe, as I was not an invitee – not that I’m complaining.

The quality of the talks in The Forum was also better for the most part. The fact that it was a closed-off room with much less ambient noise was a big benefit. The Centre Stage, in comparison, was too grand: the room was cold and noisy, and the camera that projected speakers live onto the big screen was pitched at a height where you could see people walking back and forth, impeding the view and distracting from the talk. There was also much more exclusive, informative content shared during the Forum sessions, compared to the underwhelming ‘Google-able’ info elsewhere.

All-in-all, I felt a bit hollow leaving the Summit. My attendance revolved around the talks, and I just didn’t get what I came for, outside my time spent in The Forum. I would argue that it wasn’t a tech conference at all, but a mash of questionably-web-related names – Wyclef Jean and Triple H, anyone?

I think much of the event experience would depend on what you were looking to get out of attending. If you were looking to listen to talks, see some thought-leading speakers, and find the opportunity to meet these people, this event was lacking – unless you were part of the very exclusive group of VIPs invited to The Forum. If you were there as a start-up looking for investment, there were certainly investors at the event, looking for new opportunities. If you were looking for press, you had a chance – if you already had a name for yourself, that is. And, if you were looking for something to flesh out your Instagram feed, you got it.

New – Interactive AWS Cost Explorer API

We launched the AWS Cost Explorer a couple of years ago in order to allow you to track, allocate, and manage your AWS costs. The response to that launch, and to the additions that we have made since then, has been very positive. However, our customers are, as Jeff Bezos has said, “beautifully, wonderfully, dissatisfied.”

I see this first-hand every day. We launch something and that launch inspires our customers to ask for even more. For example, with many customers going all-in and moving large parts of their IT infrastructure to the AWS Cloud, we’ve had many requests for the raw data that feeds into the Cost Explorer. These customers want to programmatically explore their AWS costs, update ledgers and accounting systems with per-application and per-department costs, and to build high-level dashboards that summarize spending. Some of these customers have been going to the trouble of extracting the data from the charts and reports provided by Cost Explorer!

New Cost Explorer API
Today we are making the underlying data that feeds into Cost Explorer available programmatically. The new Cost Explorer API gives you a set of functions that allow you to do everything that I described above. You can retrieve cost and usage data that is filtered and grouped across multiple dimensions (Service, Linked Account, tag, Availability Zone, and so forth), aggregated by day or by month. This gives you the power to start simple (total monthly costs) and to refine your requests to any desired level of detail (writes to DynamoDB tables that have been tagged as production) while getting responses in seconds.

Here are the operations:

GetCostAndUsage – Retrieve cost and usage metrics for a single account or all accounts (master accounts in an organization have access to all member accounts) with filtering and grouping.

GetDimensionValues – Retrieve available filter values for a specified filter over a specified period of time.

GetTags – Retrieve available tag keys and tag values over a specified period of time.

GetReservationUtilization – Retrieve EC2 Reserved Instance utilization over a specified period of time, with daily or monthly granularity plus filtering and grouping.
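
Here is a minimal sketch of a call to the new API using boto3, the AWS SDK for Python. The date range and grouping are placeholders; the operation and parameter names follow the Cost Explorer API described above.

```python
# Minimal sketch: last month's unblended cost, grouped by service,
# via the Cost Explorer API. Dates here are placeholders.
import boto3

# The Cost Explorer endpoint lives in US East (Northern Virginia).
ce = boto3.client("ce", region_name="us-east-1")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-10-01", "End": "2017-11-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    # The API supports up to two levels of grouping; one is used here.
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for period in response["ResultsByTime"]:
    for group in period["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print(f"{service}: ${amount:.2f}")
```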

I believe that these functions, and the data that they return, will let you do some really interesting things and give you better insight into your business. For example, you could tag the resources used to support individual marketing campaigns or development projects and then deep-dive into the costs to measure business value. You now have the potential to know, down to the penny, how much you spend on infrastructure for important events like Cyber Monday or Black Friday.

Things to Know
Here are a couple of things to keep in mind as you start to think about ways to make use of the API:

Grouping – The Cost Explorer web application provides you with one level of grouping; the APIs give you two. For example, you could group costs or RI utilization by Service and then by Region.

Pagination – The functions can return very large amounts of data and follow the AWS-wide model for pagination, including a nextPageToken in the response if additional data is available. You simply call the same function again, supplying the token, to move forward (see the sketch after this list).

Regions – The service endpoint is in the US East (Northern Virginia) Region and returns usage data for all public AWS Regions.

Pricing – Each API call costs $0.01. To put this into perspective, let’s say you use this API to build a dashboard and it gets 1000 hits per month from your users. Your operating cost for the dashboard should be $10 or so; this is far less expensive than setting up your own systems to extract & ingest the data and respond to interactive queries.
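
And here is a sketch of that pagination loop in boto3 (the field is spelled NextPageToken in the request and response; the query parameters are placeholders):

```python
# Sketch of the AWS-wide pagination model as used by Cost Explorer:
# keep calling until NextPageToken is absent from the response.
import boto3

ce = boto3.client("ce", region_name="us-east-1")

kwargs = {
    "TimePeriod": {"Start": "2017-10-01", "End": "2017-11-01"},
    "Granularity": "DAILY",
    "Metrics": ["UnblendedCost"],
}

results = []
while True:
    response = ce.get_cost_and_usage(**kwargs)
    results.extend(response["ResultsByTime"])
    token = response.get("NextPageToken")
    if not token:
        break
    kwargs["NextPageToken"] = token  # request the next page

print(f"Fetched {len(results)} result periods")
```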

The Cost Explorer API is available now and you can start using it today. To learn more, read about the Cost Explorer API.

Jeff;



from AWS News Blog http://ift.tt/2mNkO2o
via IFTTT

Amazon QuickSight Update – Geospatial Visualization, Private VPC Access, and More

We don’t often recognize or celebrate anniversaries at AWS. With nearly 100 services on our list, we’d be eating cake and drinking champagne several times a week. While that might sound like fun, we’d rather spend our working hours listening to customers and innovating. With that said, Amazon QuickSight has now been generally available for a little over a year and I would like to give you a quick update!

QuickSight in Action
Today, tens of thousands of customers (from startups to enterprises, in industries as varied as transportation, legal, mining, and healthcare) are using QuickSight to analyze and report on their business data.

Here are a couple of examples:

Gemini provides legal evidence procurement for California attorneys who represent injured workers. They have gone from creating custom reports and running one-off queries to creating and sharing dynamic QuickSight dashboards with drill-downs and filtering. QuickSight is used to track the sales pipeline, measure order throughput, and locate bottlenecks in the order processing pipeline.

Jivochat provides a real-time messaging platform to connect visitors to website owners. QuickSight lets them create and share interactive dashboards while also providing access to the underlying datasets. This has allowed them to move beyond the sharing of static spreadsheets, ensuring that everyone is looking at the same data and is empowered to make timely decisions based on current data.

Transfix is a tech-powered freight marketplace that matches loads and increases visibility into logistics for Fortune 500 shippers in retail, food and beverage, manufacturing, and other industries. QuickSight has made analytics accessible to both BI engineers and non-technical business users. They scrutinize key business and operational metrics, including shipping routes, carrier efficiency, and process automation.

Looking Back / Looking Ahead
The feedback on QuickSight has been incredibly helpful. Customers tell us that their employees are using QuickSight to connect to their data, perform analytics, and make high-velocity, data-driven decisions, all without setting up or running their own BI infrastructure. We love all of the feedback that we get, and use it to drive our roadmap, leading to the introduction of over 40 new features in just a year. Here’s a summary:

Looking forward, we are watching an interesting trend develop within our customer base. As these customers take a close look at how they analyze and report on data, they are realizing that a serverless approach offers some tangible benefits. They use Amazon Simple Storage Service (S3) as a data lake and query it using a combination of QuickSight and Amazon Athena, giving them agility and flexibility without static infrastructure. They also make great use of QuickSight’s dashboards feature, monitoring business results and operational metrics, then sharing their insights with hundreds of users. You can read Building a Serverless Analytics Solution for Cleaner Cities and review Serverless Big Data Analytics using Amazon Athena and Amazon QuickSight if you are interested in this approach.
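
For a taste of that pattern, here is a minimal sketch that runs an Athena query against data in S3 (the database, table, and results bucket are hypothetical names); QuickSight can then point at the same Athena table for dashboards.

```python
# Minimal sketch of the serverless pattern described above: query
# data sitting in S3 with Athena, with no infrastructure to manage.
# Database, table, and output bucket names are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString=(
        "SELECT region, SUM(sales) AS total "
        "FROM salesdb.orders GROUP BY region"
    ),
    QueryExecutionContext={"Database": "salesdb"},
    ResultConfiguration={"OutputLocation": "s3://my-query-results/"},
)

# Athena runs asynchronously; poll get_query_execution with this ID
# to check status and fetch results when the query completes.
print(execution["QueryExecutionId"])
```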

New Features and Enhancements
We’re still doing our best to listen and to learn, and to make sure that QuickSight continues to meet your needs. I’m happy to announce that we are making seven big additions today:

Geospatial Visualization – You can now create geospatial visuals on geographical data sets.

Private VPC Access – You can now sign up to access a preview of a new feature that allows you to securely connect to data within VPCs or on-premises, without the need for public endpoints.

Flat Table Support – In addition to pivot tables, you can now use flat tables for tabular reporting. To learn more, read about Using Tabular Reports.

Calculated SPICE Fields – You can now perform run-time calculations on SPICE data as part of your analysis. Read Adding a Calculated Field to an Analysis for more information.

Wide Table Support – You can now use tables with up to 1000 columns.

Other Buckets – You can summarize the long tail of high-cardinality data into buckets, as described in Working with Visual Types in Amazon QuickSight.

HIPAA Compliance – You can now run HIPAA-compliant workloads on QuickSight.

Geospatial Visualization
Everyone seems to want this feature! You can now take data that contains a geographic identifier (country, city, state, or zip code) and create beautiful visualizations with just a few clicks. QuickSight will geocode the identifier that you supply, and can also accept lat/long map coordinates. You can use this feature to visualize sales by state, map stores to shipping destinations, and so forth. Here’s a sample visualization:

To learn more about this feature, read Using Geospatial Charts (Maps), and Adding Geospatial Data.

Private VPC Access Preview
If you have data in AWS (perhaps in Amazon Redshift, Amazon Relational Database Service (RDS), or on EC2) or on-premises in Teradata or SQL Server on servers without public connectivity, this feature is for you. Private VPC Access for QuickSight uses an Elastic Network Interface (ENI) for secure, private communication with data sources in a VPC. It also allows you to use AWS Direct Connect to create a secure, private link with your on-premises resources. Here’s what it looks like:

If you are ready to join the preview, you can sign up today.

Jeff;

 



from AWS News Blog http://ift.tt/2B7CvwN
via IFTTT

NEW PRODUCT – USB-A Male Plug to 5-pin Terminal Block

from NEW PRODUCT – USB-A Male Plug to 5-pin Terminal Block
by Angelica

This is the USB-A Male Plug to 5-pin Terminal Block. If you need to connect to a device with a USB jack, or maybe make your own USB stick or custom cable of sorts, this adapter will come in very handy! No soldering required. Just use a small screwdriver to open up the terminal blocks, slide in your stranded or solid-core wire, and re-tighten.

The terminal block itself is removable from the body. It’s more durable than soldering wires onto a connector, and all the pins are labeled, which is really nice because we keep forgetting the order. The 5th pin is for the Sleeve (shield) of the connector.
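
For reference (the adapter’s own labels are authoritative), the standard USB 2.0 Type-A pin order is: 1 – VBUS (+5 V, red), 2 – D− (white), 3 – D+ (green), 4 – GND (black), with the fifth terminal breaking out the connector’s metal shield.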

In stock and shipping now!

How To Model Wires in Fusion 360

from How To Model Wires in Fusion 360
by Noe Ruiz

Let’s make some wires! When wiring up electronic components, do you need to model the wires too? Here’s how I did it in Fusion 360. As with screws, modeling wires helps you find potential collisions and intersections, and you can get exact wire lengths via measurements.

Boombox Project:
http://ift.tt/2z3DP6p

Please consider supporting me and the amazing folks at Adafruit by shopping for parts for your own DIY projects: http://ift.tt/1SKsp92

Download Fusion 360 FREE
http://autode.sk/1Ro3wkb

My CAD Tutorial Playlist:
https://www.youtube.com/playlist?list=PLjF7R1fz_OOVsMp6nKnpjsXSQ45nxfORb

Visit the Adafruit shop online – http://www.adafruit.com